# Dataset Card for Evaluation run of Gille/MoE-StrangeMerges-2x7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Gille/MoE-StrangeMerges-2x7B](https://huggingface.co/Gille/MoE-StrangeMerges-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__MoE-StrangeMerges-2x7B",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-02-01T18:10:21.026862](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__MoE-StrangeMerges-2x7B/blob/main/results_2024-02-01T18-10-21.026862.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):

```python
{
    "all": { "acc": 0.655278221363272, "acc_stderr": 0.03204450170239166, "acc_norm": 0.6552443381647025, "acc_norm_stderr": 0.03270643907926025, "mc1": 0.5140758873929009, "mc1_stderr": 0.01749656371704278, "mc2": 0.6586163976185392, "mc2_stderr": 0.015149576465565402 },
    "harness|arc:challenge|25": { "acc": 0.6868600682593856, "acc_stderr": 0.013552671543623501, "acc_norm": 0.7081911262798635, "acc_norm_stderr": 0.013284525292403518 },
    "harness|hellaswag|10": { "acc": 0.7072296355307708, "acc_stderr": 0.004541039698729832, "acc_norm": 0.8783110934076878, "acc_norm_stderr": 0.0032625801905118586 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967926, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967926 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754406, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754406 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.035676037996391706, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.035676037996391706 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268545, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268545 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8232323232323232, "acc_stderr": 0.027178752639044915, "acc_norm": 0.8232323232323232, "acc_norm_stderr": 0.027178752639044915 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669235, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669235 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 },
    "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42793296089385474, "acc_stderr": 0.01654788799741611, "acc_norm": 0.42793296089385474, "acc_norm_stderr": 0.01654788799741611 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.02600330111788514, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.02600330111788514 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.4654498044328553, "acc_stderr": 0.0127397115540457, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.0127397115540457 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142777, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142777 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306046, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306046 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 },
    "harness|truthfulqa:mc|0": { "mc1": 0.5140758873929009, "mc1_stderr": 0.01749656371704278, "mc2": 0.6586163976185392, "mc2_stderr": 0.015149576465565402 },
    "harness|winogrande|5": { "acc": 0.8279400157853196, "acc_stderr": 0.010607731615247015 },
    "harness|gsm8k|5": { "acc": 0.6770280515542078, "acc_stderr": 0.012880360794851806 }
}
```
## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
2024-02-01T18:13:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/MoE-StrangeMerges-2x7B Dataset automatically created during the evaluation run of model Gille/MoE-StrangeMerges-2x7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T18:10:21.026862 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
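The loading snippet referenced in the summary above was dropped when the card text was flattened; here is a minimal sketch of it, assuming the dataset id follows the leaderboard's usual `details_<org>__<model>` naming (the "harness_winogrande_5" config appears in the metadata listing above):

```python
from datasets import load_dataset

# Per-example details for one task; the "train" split always points at the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__MoE-StrangeMerges-2x7B",
    "harness_winogrande_5",
    split="train",
)
print(data)
```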
[ "# Dataset Card for Evaluation run of Gille/MoE-StrangeMerges-2x7B\n\n\n\nDataset automatically created during the evaluation run of model Gille/MoE-StrangeMerges-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:10:21.026862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/MoE-StrangeMerges-2x7B\n\n\n\nDataset automatically created during the evaluation run of model Gille/MoE-StrangeMerges-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:10:21.026862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
73cf7731c8205b1509a56b1a595f2f3d6dd320bb
# Dataset Card for Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [yanolja/Bookworm-10.7B-v0.4-DPO](https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T18:19:15.058025](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO/blob/main/results_2024-02-01T18-19-15.058025.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6512039470522575, "acc_stderr": 0.032016258824533204, "acc_norm": 0.6543530523533914, "acc_norm_stderr": 0.03265904724752235, "mc1": 0.36474908200734396, "mc1_stderr": 0.016850961061720116, "mc2": 0.5238117102691138, "mc2_stderr": 0.01570708203583901 }, "harness|arc:challenge|25": { "acc": 0.6177474402730375, "acc_stderr": 0.014200454049979282, "acc_norm": 0.6467576791808873, "acc_norm_stderr": 0.013967822714840056 }, "harness|hellaswag|10": { "acc": 0.656144194383589, "acc_stderr": 0.0047402292124734575, "acc_norm": 0.8442541326428998, "acc_norm_stderr": 0.0036187316588377092 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7236842105263158, "acc_stderr": 0.03639057569952929, "acc_norm": 0.7236842105263158, "acc_norm_stderr": 0.03639057569952929 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4523809523809524, "acc_stderr": 0.02563425811555495, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.02563425811555495 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.02302589961718872, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.02302589961718872 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.458128078817734, "acc_stderr": 0.03505630140785741, "acc_norm": 0.458128078817734, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8282828282828283, "acc_stderr": 0.026869716187429903, "acc_norm": 0.8282828282828283, "acc_norm_stderr": 0.026869716187429903 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644234, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644234 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6205128205128205, "acc_stderr": 0.024603626924097413, "acc_norm": 0.6205128205128205, "acc_norm_stderr": 0.024603626924097413 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3925925925925926, "acc_stderr": 0.029773847012532967, "acc_norm": 0.3925925925925926, "acc_norm_stderr": 0.029773847012532967 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7058823529411765, "acc_stderr": 0.029597329730978082, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.029597329730978082 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 
0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.015848255806501562, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.015848255806501562 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.02485747808025046, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.02485747808025046 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8607594936708861, "acc_stderr": 0.022535526352692705, "acc_norm": 0.8607594936708861, "acc_norm_stderr": 0.022535526352692705 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.03050028317654585, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.03050028317654585 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824846, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824846 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615623, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464078, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464078 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.02386800326250011, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.02386800326250011 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4, "acc_stderr": 0.016384638410380823, "acc_norm": 0.4, "acc_norm_stderr": 0.016384638410380823 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.02438366553103545, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.02438366553103545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5070921985815603, "acc_stderr": 0.02982449855912901, 
"acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.485006518904824, "acc_stderr": 0.01276449320219326, "acc_norm": 0.485006518904824, "acc_norm_stderr": 0.01276449320219326 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.028064998167040094, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.028064998167040094 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6503267973856209, "acc_stderr": 0.019291961895066375, "acc_norm": 0.6503267973856209, "acc_norm_stderr": 0.019291961895066375 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.02737294220178816, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.02737294220178816 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466108, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466108 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.029913127232368053, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.029913127232368053 }, "harness|truthfulqa:mc|0": { "mc1": 0.36474908200734396, "mc1_stderr": 0.016850961061720116, "mc2": 0.5238117102691138, "mc2_stderr": 0.01570708203583901 }, "harness|winogrande|5": { "acc": 0.8113654301499605, "acc_stderr": 0.0109951723180198 }, "harness|gsm8k|5": { "acc": 0.5223654283548143, "acc_stderr": 0.013758699485911838 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
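Beyond the per-task detail configs, the aggregated metrics described in this card can be read directly from the "results" configuration; a small sketch, assuming the "results" configuration exposes the same "latest" split alias as the per-task configs:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO",
    "results",
    split="latest",
)
print(results[0])
```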
open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO
[ "region:us" ]
2024-02-01T18:18:32+00:00
{"pretty_name": "Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [yanolja/Bookworm-10.7B-v0.4-DPO](https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T18:19:15.058025](https://huggingface.co/datasets/open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO/blob/main/results_2024-02-01T18-19-15.058025.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512039470522575,\n \"acc_stderr\": 0.032016258824533204,\n \"acc_norm\": 0.6543530523533914,\n \"acc_norm_stderr\": 0.03265904724752235,\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5238117102691138,\n \"mc2_stderr\": 0.01570708203583901\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979282,\n \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840056\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.656144194383589,\n \"acc_stderr\": 0.0047402292124734575,\n \"acc_norm\": 0.8442541326428998,\n \"acc_norm_stderr\": 0.0036187316588377092\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555495,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555495\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644234,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644234\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6205128205128205,\n \"acc_stderr\": 0.024603626924097413,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097413\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.029773847012532967,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.029773847012532967\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824846,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824846\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464078,\n \"acc_norm\": 0.8186462324393359,\n 
\"acc_norm_stderr\": 0.013778693778464078\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.485006518904824,\n \"acc_stderr\": 0.01276449320219326,\n \"acc_norm\": 0.485006518904824,\n \"acc_norm_stderr\": 0.01276449320219326\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066375,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066375\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368053,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368053\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.016850961061720116,\n \"mc2\": 0.5238117102691138,\n \"mc2_stderr\": 0.01570708203583901\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.0109951723180198\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5223654283548143,\n \"acc_stderr\": 0.013758699485911838\n }\n}\n```", "repo_url": "https://huggingface.co/yanolja/Bookworm-10.7B-v0.4-DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-16-15.402421.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-16-15.402421.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-16-15.402421.parquet"]}, 
{"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["**/details_harness|winogrande|5_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": ["**/details_harness|winogrande|5_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T18-19-15.058025.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T18_16_15.402421", "path": ["results_2024-02-01T18-16-15.402421.parquet"]}, {"split": "2024_02_01T18_19_15.058025", "path": 
["results_2024-02-01T18-19-15.058025.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T18-19-15.058025.parquet"]}]}]}
2024-02-01T18:21:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO Dataset automatically created during the evaluation run of model yanolja/Bookworm-10.7B-v0.4-DPO on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this record): ## Latest results These are the latest results from run 2024-02-01T18:19:15.058025 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO\n\n\n\nDataset automatically created during the evaluation run of model yanolja/Bookworm-10.7B-v0.4-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:19:15.058025(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yanolja/Bookworm-10.7B-v0.4-DPO\n\n\n\nDataset automatically created during the evaluation run of model yanolja/Bookworm-10.7B-v0.4-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:19:15.058025(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
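The processed card text above refers to a loading snippet that was stripped during processing. A minimal sketch of what such a call could look like is given below; the repository name `open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO` and the configuration `harness_winogrande_5` are assumptions inferred from the naming pattern used by the other details repositories in this dump, not values stated in this record.

```python
# Minimal sketch (assumed repository and config names, see note above):
# load one per-task configuration of the evaluation-details dataset.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_yanolja__Bookworm-10.7B-v0.4-DPO",  # assumed repo name
    "harness_winogrande_5",  # one per-task configuration
    split="train",  # "train" always points at the latest results
)
print(data)
```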
73c643c68bc02a9debe487b01cd95159fbce2cc5
# Dataset Card for "snips_test_valid_unit" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/snips_test_valid_unit
[ "region:us" ]
2024-02-01T18:28:07+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 104890744, "num_examples": 22400}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 104890744, "num_examples": 22400}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 156911640, "num_examples": 22400}, {"name": "audiodec_24k_320d", "num_bytes": 335215352, "num_examples": 22400}, {"name": "dac_16k", "num_bytes": 329541496, "num_examples": 22400}, {"name": "dac_24k", "num_bytes": 1316239608, "num_examples": 22400}, {"name": "dac_44k", "num_bytes": 425937832, "num_examples": 22400}, {"name": "encodec_24k_12bps", "num_bytes": 627940216, "num_examples": 22400}, {"name": "encodec_24k_1_5bps", "num_bytes": 79225672, "num_examples": 22400}, {"name": "encodec_24k_24bps", "num_bytes": 1255042552, "num_examples": 22400}, {"name": "encodec_24k_3bps", "num_bytes": 157613464, "num_examples": 22400}, {"name": "encodec_24k_6bps", "num_bytes": 314389048, "num_examples": 22400}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 838995192, "num_examples": 22400}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 838995192, "num_examples": 22400}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 838875384, "num_examples": 22400}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 422686712, "num_examples": 22400}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 838875384, "num_examples": 22400}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 422686712, "num_examples": 22400}, {"name": "speech_tokenizer_16k", "num_bytes": 210347256, "num_examples": 22400}], "download_size": 1509323889, "dataset_size": 9619300200}}
2024-02-01T19:10:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for "snips_test_valid_unit" More Information needed
[ "# Dataset Card for \"snips_test_valid_unit\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"snips_test_valid_unit\"\n\nMore Information needed" ]
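The `snips_test_valid_unit` record above lists one split per codec model (for example `dac_16k` or `encodec_24k_6bps`) and two features: `id` (a string) and `unit` (a nested sequence of int64 tokens). A minimal sketch of loading one split and inspecting that structure follows; `dac_16k` is chosen purely as an illustrative split.

```python
# Minimal sketch: load one codec split of Codec-SUPERB/snips_test_valid_unit
# and inspect the nested unit structure described in the config above.
from datasets import load_dataset

units = load_dataset("Codec-SUPERB/snips_test_valid_unit", split="dac_16k")  # illustrative split choice
example = units[0]
print(example["id"])             # utterance identifier (string feature)
print(len(example["unit"]))      # length of the outer unit sequence
print(len(example["unit"][0]))   # length of the first inner sequence of int64 tokens
```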
ea2517baf384464d028122e419b414924fb1f7e5
# Dataset Card for Evaluation run of DreadPoor/ToppyEvil-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [DreadPoor/ToppyEvil-7B-slerp](https://huggingface.co/DreadPoor/ToppyEvil-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T18:32:26.393415](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp/blob/main/results_2024-02-01T18-32-26.393415.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6371495869198338, "acc_stderr": 0.03235698893021149, "acc_norm": 0.6394933656367955, "acc_norm_stderr": 0.03300210838103083, "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916916, "mc2": 0.4605602132329784, "mc2_stderr": 0.014970488994464466 }, "harness|arc:challenge|25": { "acc": 0.6100682593856656, "acc_stderr": 0.014252959848892893, "acc_norm": 0.636518771331058, "acc_norm_stderr": 0.014056207319068283 }, "harness|hellaswag|10": { "acc": 0.6714797849034057, "acc_stderr": 0.00468715199479107, "acc_norm": 0.8428599880501892, "acc_norm_stderr": 0.00363188949612254 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.03260038511835771, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.03260038511835771 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531003, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531003 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7645161290322581, "acc_stderr": 0.02413763242933771, "acc_norm": 0.7645161290322581, "acc_norm_stderr": 0.02413763242933771 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.024639789097709443, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.024639789097709443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.02432173848460235, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.02432173848460235 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524572, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524572 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.02983796238829194, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.02983796238829194 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.40397350993377484, "acc_stderr": 
0.04006485685365342, "acc_norm": 0.40397350993377484, "acc_norm_stderr": 0.04006485685365342 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976044, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976044 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078966, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078966 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7721518987341772, "acc_stderr": 0.027303484599069422, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.027303484599069422 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.03226219377286775, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.03226219377286775 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608318, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608318 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500107, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500107 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25921787709497207, "acc_stderr": 0.014655780837497724, "acc_norm": 0.25921787709497207, "acc_norm_stderr": 0.014655780837497724 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.02638527370346449, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.02638527370346449 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495036, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495036 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45371577574967403, "acc_stderr": 0.012715404841277738, "acc_norm": 0.45371577574967403, "acc_norm_stderr": 0.012715404841277738 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462937, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462937 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784586, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784586 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454132, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454132 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916916, "mc2": 0.4605602132329784, "mc2_stderr": 0.014970488994464466 }, "harness|winogrande|5": { "acc": 0.7758484609313339, "acc_stderr": 0.011720400740774092 }, "harness|gsm8k|5": { "acc": 0.5579984836997726, "acc_stderr": 0.01367951449281457 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
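The card above notes that an additional "results" configuration stores the aggregated results of the run. A minimal sketch of loading it is shown below; the `results` config name and the `latest` split follow the convention visible in the metadata of the other details repositories in this dump and are assumed to apply here as well.

```python
# Minimal sketch (assumed config/split names, see note above): load the
# aggregated results for the latest run of this evaluation-details dataset.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp",
    "results",       # aggregated-results configuration
    split="latest",  # assumed to point at the most recent run
)
print(results[0])
```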
open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp
[ "region:us" ]
2024-02-01T18:34:44+00:00
{"pretty_name": "Evaluation run of DreadPoor/ToppyEvil-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/ToppyEvil-7B-slerp](https://huggingface.co/DreadPoor/ToppyEvil-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T18:32:26.393415](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyEvil-7B-slerp/blob/main/results_2024-02-01T18-32-26.393415.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6371495869198338,\n \"acc_stderr\": 0.03235698893021149,\n \"acc_norm\": 0.6394933656367955,\n \"acc_norm_stderr\": 0.03300210838103083,\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.4605602132329784,\n \"mc2_stderr\": 0.014970488994464466\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.014252959848892893,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6714797849034057,\n \"acc_stderr\": 0.00468715199479107,\n \"acc_norm\": 0.8428599880501892,\n \"acc_norm_stderr\": 0.00363188949612254\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.40397350993377484,\n \"acc_stderr\": 0.04006485685365342,\n \"acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.04006485685365342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608318,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608318\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n \"acc_stderr\": 0.014655780837497724,\n \"acc_norm\": 0.25921787709497207,\n \"acc_norm_stderr\": 0.014655780837497724\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784586,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784586\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.4605602132329784,\n \"mc2_stderr\": 0.014970488994464466\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774092\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5579984836997726,\n \"acc_stderr\": 0.01367951449281457\n 
}\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/ToppyEvil-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-32-26.393415.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-32-26.393415.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-32-26.393415.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-32-26.393415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-32-26.393415.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T18_32_26.393415", "path": ["**/details_harness|winogrande|5_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T18-32-26.393415.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T18_32_26.393415", "path": ["results_2024-02-01T18-32-26.393415.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T18-32-26.393415.parquet"]}]}]}
2024-02-01T18:35:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of DreadPoor/ToppyEvil-7B-slerp Dataset automatically created during the evaluation run of model DreadPoor/ToppyEvil-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T18:32:26.393415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of DreadPoor/ToppyEvil-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyEvil-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:32:26.393415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of DreadPoor/ToppyEvil-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyEvil-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:32:26.393415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
25aa1db8c3eeafea7e760813dcdebdecea21456f
# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-long-merge

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [fionazhang/fine-tune-mistral-long-merge](https://huggingface.co/fionazhang/fine-tune-mistral-long-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fionazhang__fine-tune-mistral-long-merge",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-02-01T18:38:59.873135](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-long-merge/blob/main/results_2024-02-01T18-38-59.873135.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6310974603167736,
        "acc_stderr": 0.03251926531091339,
        "acc_norm": 0.6372662374519631,
        "acc_norm_stderr": 0.03317893564792818,
        "mc1": 0.2937576499388005,
        "mc1_stderr": 0.015945068581236614,
        "mc2": 0.4393573192333758,
        "mc2_stderr": 0.014110064746912822
    },
    "harness|arc:challenge|25": {
        "acc": 0.5767918088737202,
        "acc_stderr": 0.01443803622084803,
        "acc_norm": 0.628839590443686,
        "acc_norm_stderr": 0.014117971901142824
    },
    "harness|hellaswag|10": {
        "acc": 0.6363274248157738,
        "acc_stderr": 0.004800728138792393,
        "acc_norm": 0.8361880103565027,
        "acc_norm_stderr": 0.003693484894179418
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.28,
        "acc_stderr": 0.04512608598542129,
        "acc_norm": 0.28,
        "acc_norm_stderr": 0.04512608598542129
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.6,
        "acc_stderr": 0.04232073695151589,
        "acc_norm": 0.6,
        "acc_norm_stderr": 0.04232073695151589
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.625,
        "acc_stderr": 0.039397364351956274,
        "acc_norm": 0.625,
        "acc_norm_stderr": 0.039397364351956274
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.55,
        "acc_stderr": 0.05,
        "acc_norm": 0.55,
        "acc_norm_stderr": 0.05
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.6641509433962264,
        "acc_stderr": 0.029067220146644826,
        "acc_norm": 0.6641509433962264,
        "acc_norm_stderr": 0.029067220146644826
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.7083333333333334,
        "acc_stderr": 0.038009680605548594,
        "acc_norm": 0.7083333333333334,
        "acc_norm_stderr": 0.038009680605548594
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.47,
        "acc_stderr": 0.05016135580465919,
        "acc_norm": 0.47,
        "acc_norm_stderr": 0.05016135580465919
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.56,
        "acc_stderr": 0.04988876515698589,
        "acc_norm": 0.56,
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145633,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145633
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.6647398843930635,
        "acc_stderr": 0.03599586301247077,
        "acc_norm": 0.6647398843930635,
        "acc_norm_stderr": 0.03599586301247077
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.38235294117647056,
        "acc_stderr": 0.04835503696107223,
        "acc_norm": 0.38235294117647056,
        "acc_norm_stderr": 0.04835503696107223
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.75,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.75,
        "acc_norm_stderr": 0.04351941398892446
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.5702127659574469,
        "acc_stderr": 0.03236214467715564,
        "acc_norm": 0.5702127659574469,
        "acc_norm_stderr": 0.03236214467715564
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.49122807017543857,
        "acc_stderr": 0.04702880432049615,
        "acc_norm": 0.49122807017543857,
        "acc_norm_stderr": 0.04702880432049615
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.5655172413793104,
        "acc_stderr": 0.04130740879555497,
        "acc_norm": 0.5655172413793104,
        "acc_norm_stderr": 0.04130740879555497
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.3888888888888889,
        "acc_stderr": 0.025107425481137282,
        "acc_norm": 0.3888888888888889,
        "acc_norm_stderr": 0.025107425481137282
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.4365079365079365,
        "acc_stderr": 0.04435932892851466,
        "acc_norm": 0.4365079365079365,
        "acc_norm_stderr": 0.04435932892851466
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.7516129032258064,
        "acc_stderr": 0.024580028921481003,
        "acc_norm": 0.7516129032258064,
        "acc_norm_stderr": 0.024580028921481003
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.49261083743842365,
        "acc_stderr": 0.035176035403610084,
        "acc_norm": 0.49261083743842365,
        "acc_norm_stderr": 0.035176035403610084
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.67,
        "acc_stderr": 0.04725815626252607,
        "acc_norm": 0.67,
        "acc_norm_stderr": 0.04725815626252607
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7454545454545455,
        "acc_stderr": 0.03401506715249039,
        "acc_norm": 0.7454545454545455,
        "acc_norm_stderr": 0.03401506715249039
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7727272727272727,
        "acc_stderr": 0.029857515673386414,
        "acc_norm": 0.7727272727272727,
        "acc_norm_stderr": 0.029857515673386414
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8704663212435233,
        "acc_stderr": 0.02423353229775873,
        "acc_norm": 0.8704663212435233,
        "acc_norm_stderr": 0.02423353229775873
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.6384615384615384,
        "acc_stderr": 0.024359581465397,
        "acc_norm": 0.6384615384615384,
        "acc_norm_stderr": 0.024359581465397
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.37037037037037035,
        "acc_stderr": 0.02944316932303154,
        "acc_norm": 0.37037037037037035,
        "acc_norm_stderr": 0.02944316932303154
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.6680672268907563,
        "acc_stderr": 0.03058869701378364,
        "acc_norm": 0.6680672268907563,
        "acc_norm_stderr": 0.03058869701378364
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.31125827814569534,
        "acc_stderr": 0.03780445850526733,
        "acc_norm": 0.31125827814569534,
        "acc_norm_stderr": 0.03780445850526733
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8256880733944955,
        "acc_stderr": 0.016265675632010354,
        "acc_norm": 0.8256880733944955,
        "acc_norm_stderr": 0.016265675632010354
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.5601851851851852,
        "acc_stderr": 0.0338517797604481,
        "acc_norm": 0.5601851851851852,
        "acc_norm_stderr": 0.0338517797604481
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.7941176470588235,
        "acc_stderr": 0.028379449451588667,
        "acc_norm": 0.7941176470588235,
        "acc_norm_stderr": 0.028379449451588667
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7552742616033755,
        "acc_stderr": 0.02798569938703643,
        "acc_norm": 0.7552742616033755,
        "acc_norm_stderr": 0.02798569938703643
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6816143497757847,
        "acc_stderr": 0.03126580522513713,
        "acc_norm": 0.6816143497757847,
        "acc_norm_stderr": 0.03126580522513713
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.7633587786259542,
        "acc_stderr": 0.03727673575596914,
        "acc_norm": 0.7633587786259542,
        "acc_norm_stderr": 0.03727673575596914
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7933884297520661,
        "acc_stderr": 0.03695980128098824,
        "acc_norm": 0.7933884297520661,
        "acc_norm_stderr": 0.03695980128098824
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7222222222222222,
        "acc_stderr": 0.04330043749650743,
        "acc_norm": 0.7222222222222222,
        "acc_norm_stderr": 0.04330043749650743
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.8098159509202454,
        "acc_stderr": 0.03083349114628124,
        "acc_norm": 0.8098159509202454,
        "acc_norm_stderr": 0.03083349114628124
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.4642857142857143,
        "acc_stderr": 0.04733667890053756,
        "acc_norm": 0.4642857142857143,
        "acc_norm_stderr": 0.04733667890053756
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7864077669902912,
        "acc_stderr": 0.040580420156460344,
        "acc_norm": 0.7864077669902912,
        "acc_norm_stderr": 0.040580420156460344
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8717948717948718,
        "acc_stderr": 0.02190190511507333,
        "acc_norm": 0.8717948717948718,
        "acc_norm_stderr": 0.02190190511507333
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.75,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.75,
        "acc_norm_stderr": 0.04351941398892446
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.8173690932311622,
        "acc_stderr": 0.013816335389973133,
        "acc_norm": 0.8173690932311622,
        "acc_norm_stderr": 0.013816335389973133
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.7138728323699421,
        "acc_stderr": 0.02433214677913413,
        "acc_norm": 0.7138728323699421,
        "acc_norm_stderr": 0.02433214677913413
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.39217877094972065,
        "acc_stderr": 0.01632906107320744,
        "acc_norm": 0.39217877094972065,
        "acc_norm_stderr": 0.01632906107320744
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.7647058823529411,
        "acc_stderr": 0.0242886194660461,
        "acc_norm": 0.7647058823529411,
        "acc_norm_stderr": 0.0242886194660461
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.684887459807074,
        "acc_stderr": 0.026385273703464482,
        "acc_norm": 0.684887459807074,
        "acc_norm_stderr": 0.026385273703464482
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.7067901234567902,
        "acc_stderr": 0.025329888171900926,
        "acc_norm": 0.7067901234567902,
        "acc_norm_stderr": 0.025329888171900926
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.49645390070921985,
        "acc_stderr": 0.02982674915328092,
        "acc_norm": 0.49645390070921985,
        "acc_norm_stderr": 0.02982674915328092
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.44198174706649285,
        "acc_stderr": 0.01268397251359881,
        "acc_norm": 0.44198174706649285,
        "acc_norm_stderr": 0.01268397251359881
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.6875,
        "acc_stderr": 0.02815637344037142,
        "acc_norm": 0.6875,
        "acc_norm_stderr": 0.02815637344037142
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.6503267973856209,
        "acc_stderr": 0.01929196189506638,
        "acc_norm": 0.6503267973856209,
        "acc_norm_stderr": 0.01929196189506638
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6818181818181818,
        "acc_stderr": 0.044612721759105085,
        "acc_norm": 0.6818181818181818,
        "acc_norm_stderr": 0.044612721759105085
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.7142857142857143,
        "acc_stderr": 0.0289205832206756,
        "acc_norm": 0.7142857142857143,
        "acc_norm_stderr": 0.0289205832206756
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.8109452736318408,
        "acc_stderr": 0.02768691358801302,
        "acc_norm": 0.8109452736318408,
        "acc_norm_stderr": 0.02768691358801302
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.86,
        "acc_stderr": 0.034873508801977704,
        "acc_norm": 0.86,
        "acc_norm_stderr": 0.034873508801977704
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.5481927710843374,
        "acc_stderr": 0.03874371556587953,
        "acc_norm": 0.5481927710843374,
        "acc_norm_stderr": 0.03874371556587953
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8362573099415205,
        "acc_stderr": 0.028380919596145866,
        "acc_norm": 0.8362573099415205,
        "acc_norm_stderr": 0.028380919596145866
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2937576499388005,
        "mc1_stderr": 0.015945068581236614,
        "mc2": 0.4393573192333758,
        "mc2_stderr": 0.014110064746912822
    },
    "harness|winogrande|5": {
        "acc": 0.7892659826361483,
        "acc_stderr": 0.011462046419710674
    },
    "harness|gsm8k|5": {
        "acc": 0.36087945413191813,
        "acc_stderr": 0.013228626753925138
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...) -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
open-llm-leaderboard/details_fionazhang__fine-tune-mistral-long-merge
[ "region:us" ]
2024-02-01T18:41:19+00:00
{"pretty_name": "Evaluation run of fionazhang/fine-tune-mistral-long-merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [fionazhang/fine-tune-mistral-long-merge](https://huggingface.co/fionazhang/fine-tune-mistral-long-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__fine-tune-mistral-long-merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T18:38:59.873135](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-long-merge/blob/main/results_2024-02-01T18-38-59.873135.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6310974603167736,\n \"acc_stderr\": 0.03251926531091339,\n \"acc_norm\": 0.6372662374519631,\n \"acc_norm_stderr\": 0.03317893564792818,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4393573192333758,\n \"mc2_stderr\": 0.014110064746912822\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.01443803622084803,\n \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142824\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6363274248157738,\n \"acc_stderr\": 0.004800728138792393,\n \"acc_norm\": 0.8361880103565027,\n \"acc_norm_stderr\": 0.003693484894179418\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6384615384615384,\n \"acc_stderr\": 0.024359581465397,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628124,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628124\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973133,\n \"acc_norm\": 
0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n \"acc_stderr\": 0.01632906107320744,\n \"acc_norm\": 0.39217877094972065,\n \"acc_norm_stderr\": 0.01632906107320744\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4393573192333758,\n \"mc2_stderr\": 0.014110064746912822\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710674\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \"acc_stderr\": 0.013228626753925138\n }\n}\n```", "repo_url": "https://huggingface.co/fionazhang/fine-tune-mistral-long-merge", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-38-59.873135.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-38-59.873135.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-38-59.873135.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-38-59.873135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-38-59.873135.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-38-59.873135.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["**/details_harness|winogrande|5_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T18-38-59.873135.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T18_38_59.873135", "path": ["results_2024-02-01T18-38-59.873135.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T18-38-59.873135.parquet"]}]}]}
2024-02-01T18:41:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-long-merge Dataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-long-merge on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T18:38:59.873135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-long-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-long-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:38:59.873135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-long-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-long-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:38:59.873135(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ccda23e2c1a77612a40add84303fcf902d59ee29
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v9 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v9](https://huggingface.co/CultriX/Wernicke-7B-v9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-v9", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T18:44:19.770806](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v9/blob/main/results_2024-02-01T18-44-19.770806.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6546993526205855, "acc_stderr": 0.032063919741404885, "acc_norm": 0.6543080706718742, "acc_norm_stderr": 0.0327327760979397, "mc1": 0.5605875152998776, "mc1_stderr": 0.0173745204825137, "mc2": 0.7185564600293038, "mc2_stderr": 0.014650212605887704 }, "harness|arc:challenge|25": { "acc": 0.7022184300341296, "acc_stderr": 0.013363080107244482, "acc_norm": 0.7244027303754266, "acc_norm_stderr": 0.01305716965576184 }, "harness|hellaswag|10": { "acc": 0.7065325632344155, "acc_stderr": 0.004544201359074618, "acc_norm": 0.885381398127863, "acc_norm_stderr": 0.003179100565887989 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138215, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138215 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356852, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356852 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258176, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4100558659217877, "acc_stderr": 0.016449708209026078, "acc_norm": 0.4100558659217877, "acc_norm_stderr": 0.016449708209026078 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959607, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959607 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4726205997392438, "acc_stderr": 0.012751075788015058, "acc_norm": 0.4726205997392438, "acc_norm_stderr": 0.012751075788015058 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5605875152998776, "mc1_stderr": 0.0173745204825137, "mc2": 0.7185564600293038, "mc2_stderr": 0.014650212605887704 }, "harness|winogrande|5": { "acc": 0.840568271507498, "acc_stderr": 0.010288617479454764 }, "harness|gsm8k|5": { "acc": 0.6929492039423806, "acc_stderr": 0.012705685723131705 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
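Beyond the single-task example earlier in this card, the same `datasets` API can reach the rest of the repository. The snippet below is a minimal sketch, not part of the original card: it assumes only the standard `datasets` helpers, and it deliberately prints the stored column names instead of asserting a schema, since the exact fields of the parquet files are not documented here. It lists the 63 configurations and loads the aggregated "results" configuration through its "latest" split.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_CultriX__Wernicke-7B-v9"

# Enumerate every configuration stored in this repo (one per evaluated task,
# plus the aggregated "results" configuration).
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# "latest" always points at the most recent run; for this card that is
# 2024-02-01T18:44:19.770806.
results = load_dataset(repo, "results", split="latest")
print(results.column_names)  # inspect the schema rather than assuming it
```

A specific timestamped split can be loaded the same way by replacing "latest" with the split name listed in the configuration metadata (e.g. "2024_02_01T18_44_19.770806").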
open-llm-leaderboard/details_CultriX__Wernicke-7B-v9
[ "region:us" ]
2024-02-01T18:46:37+00:00
{"pretty_name": "Evaluation run of CultriX/Wernicke-7B-v9", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v9](https://huggingface.co/CultriX/Wernicke-7B-v9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__Wernicke-7B-v9\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T18:44:19.770806](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v9/blob/main/results_2024-02-01T18-44-19.770806.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546993526205855,\n \"acc_stderr\": 0.032063919741404885,\n \"acc_norm\": 0.6543080706718742,\n \"acc_norm_stderr\": 0.0327327760979397,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.0173745204825137,\n \"mc2\": 0.7185564600293038,\n \"mc2_stderr\": 0.014650212605887704\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244482,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7065325632344155,\n \"acc_stderr\": 0.004544201359074618,\n \"acc_norm\": 0.885381398127863,\n \"acc_norm_stderr\": 0.003179100565887989\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.0173745204825137,\n \"mc2\": 0.7185564600293038,\n \"mc2_stderr\": 0.014650212605887704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131705\n }\n}\n```", "repo_url": 
"https://huggingface.co/CultriX/Wernicke-7B-v9", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-44-19.770806.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-44-19.770806.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-44-19.770806.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-44-19.770806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-44-19.770806.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T18_44_19.770806", "path": ["**/details_harness|winogrande|5_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T18-44-19.770806.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T18_44_19.770806", "path": ["results_2024-02-01T18-44-19.770806.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T18-44-19.770806.parquet"]}]}]}
2024-02-01T18:47:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v9 Dataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v9 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T18:44:19.770806 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
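The flattened card above announces a loading snippet but the code block was stripped during processing. A minimal sketch, assuming the repository id follows the leaderboard naming convention `open-llm-leaderboard/details_<org>__<model>` used by the other cards in this dump (the exact id is inferred, not stated in this card):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention, not stated in the card.
# Any of the listed configs (e.g. "harness_winogrande_5" or "results") can be requested.
data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-v9",
	"harness_winogrande_5",
	split="train")
```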
[ "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v9\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:44:19.770806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v9\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:44:19.770806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
57918c705008316f06ac409c7a4e39747f6ea302
# Dataset Card for Evaluation run of marcel/phi-2-openhermes-30k <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [marcel/phi-2-openhermes-30k](https://huggingface.co/marcel/phi-2-openhermes-30k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_marcel__phi-2-openhermes-30k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T18:54:53.186103](https://huggingface.co/datasets/open-llm-leaderboard/details_marcel__phi-2-openhermes-30k/blob/main/results_2024-02-01T18-54-53.186103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5732804491341541, "acc_stderr": 0.03371240537561013, "acc_norm": 0.5752830563376676, "acc_norm_stderr": 0.034401894257503625, "mc1": 0.31456548347613217, "mc1_stderr": 0.016255241993179178, "mc2": 0.45379798730359744, "mc2_stderr": 0.015158432314849521 }, "harness|arc:challenge|25": { "acc": 0.5767918088737202, "acc_stderr": 0.014438036220848025, "acc_norm": 0.6100682593856656, "acc_norm_stderr": 0.014252959848892896 }, "harness|hellaswag|10": { "acc": 0.569308902609042, "acc_stderr": 0.004941609820763585, "acc_norm": 0.7471619199362677, "acc_norm_stderr": 0.004337506344899918 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6452830188679245, "acc_stderr": 0.029445175328199586, "acc_norm": 0.6452830188679245, "acc_norm_stderr": 0.029445175328199586 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 
0.048783173121456316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283647, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082633, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082633 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4978723404255319, "acc_stderr": 0.032685726586674915, "acc_norm": 0.4978723404255319, "acc_norm_stderr": 0.032685726586674915 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.025699352832131796, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.025699352832131796 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.04375888492727061, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.04375888492727061 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6838709677419355, "acc_stderr": 0.026450874489042757, "acc_norm": 0.6838709677419355, "acc_norm_stderr": 0.026450874489042757 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6424242424242425, "acc_stderr": 0.03742597043806585, "acc_norm": 0.6424242424242425, "acc_norm_stderr": 0.03742597043806585 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.702020202020202, "acc_stderr": 0.03258630383836556, "acc_norm": 0.702020202020202, "acc_norm_stderr": 0.03258630383836556 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8031088082901554, "acc_stderr": 0.028697873971860677, "acc_norm": 0.8031088082901554, "acc_norm_stderr": 0.028697873971860677 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5974358974358974, "acc_stderr": 0.024864995159767755, "acc_norm": 0.5974358974358974, "acc_norm_stderr": 0.024864995159767755 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608466, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608466 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.592436974789916, "acc_stderr": 0.031918633744784645, "acc_norm": 0.592436974789916, "acc_norm_stderr": 0.031918633744784645 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, 
"acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.016659279700295838, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.016659279700295838 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4305555555555556, "acc_stderr": 0.03376922151252336, "acc_norm": 0.4305555555555556, "acc_norm_stderr": 0.03376922151252336 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6470588235294118, "acc_stderr": 0.03354092437591518, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.03354092437591518 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.028458820991460288, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.028458820991460288 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.03244305283008731, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.03244305283008731 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516302, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516302 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.03559039531617342, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.03559039531617342 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.02581923325648371, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.02581923325648371 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6768837803320562, "acc_stderr": 0.016723726512343048, "acc_norm": 0.6768837803320562, "acc_norm_stderr": 0.016723726512343048 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.630057803468208, "acc_stderr": 0.025992472029306397, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.025992472029306397 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.01442229220480886, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.01442229220480886 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6339869281045751, "acc_stderr": 0.027582811415159614, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.027582811415159614 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.639871382636656, "acc_stderr": 0.02726429759980402, "acc_norm": 0.639871382636656, "acc_norm_stderr": 0.02726429759980402 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027125115513166848, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027125115513166848 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.40070921985815605, "acc_stderr": 0.02923346574557308, "acc_norm": 0.40070921985815605, "acc_norm_stderr": 0.02923346574557308 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41460234680573665, "acc_stderr": 0.012582597058908284, "acc_norm": 0.41460234680573665, "acc_norm_stderr": 0.012582597058908284 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5073529411764706, "acc_stderr": 0.030369552523902173, "acc_norm": 0.5073529411764706, "acc_norm_stderr": 0.030369552523902173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5588235294117647, "acc_stderr": 0.02008736207670285, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.02008736207670285 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786865, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786865 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.038922121953330446, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.038922121953330446 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6608187134502924, "acc_stderr": 0.03631053496488904, "acc_norm": 0.6608187134502924, "acc_norm_stderr": 0.03631053496488904 }, "harness|truthfulqa:mc|0": { "mc1": 0.31456548347613217, "mc1_stderr": 0.016255241993179178, "mc2": 0.45379798730359744, "mc2_stderr": 0.015158432314849521 }, "harness|winogrande|5": { "acc": 0.7490134175217048, "acc_stderr": 0.012185776220516146 }, "harness|gsm8k|5": { "acc": 0.49052312357846856, "acc_stderr": 0.013770010651168823 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_marcel__phi-2-openhermes-30k
[ "region:us" ]
2024-02-01T18:56:34+00:00
{"pretty_name": "Evaluation run of marcel/phi-2-openhermes-30k", "dataset_summary": "Dataset automatically created during the evaluation run of model [marcel/phi-2-openhermes-30k](https://huggingface.co/marcel/phi-2-openhermes-30k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcel__phi-2-openhermes-30k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T18:54:53.186103](https://huggingface.co/datasets/open-llm-leaderboard/details_marcel__phi-2-openhermes-30k/blob/main/results_2024-02-01T18-54-53.186103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5732804491341541,\n \"acc_stderr\": 0.03371240537561013,\n \"acc_norm\": 0.5752830563376676,\n \"acc_norm_stderr\": 0.034401894257503625,\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.45379798730359744,\n \"mc2_stderr\": 0.015158432314849521\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848025,\n \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892896\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.569308902609042,\n \"acc_stderr\": 0.004941609820763585,\n \"acc_norm\": 0.7471619199362677,\n \"acc_norm_stderr\": 0.004337506344899918\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.029445175328199586,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.029445175328199586\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n 
\"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131796,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131796\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n \"acc_stderr\": 0.026450874489042757,\n \"acc_norm\": 0.6838709677419355,\n \"acc_norm_stderr\": 0.026450874489042757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767755,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767755\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460288,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460288\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6768837803320562,\n \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306397,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306397\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.01442229220480886,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.01442229220480886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n \"acc_stderr\": 0.02726429759980402,\n \"acc_norm\": 0.639871382636656,\n \"acc_norm_stderr\": 0.02726429759980402\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166848,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166848\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557308,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557308\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02008736207670285,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02008736207670285\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786865,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786865\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.038922121953330446,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.038922121953330446\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.45379798730359744,\n \"mc2_stderr\": 0.015158432314849521\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516146\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.49052312357846856,\n \"acc_stderr\": 0.013770010651168823\n }\n}\n```", "repo_url": 
"https://huggingface.co/marcel/phi-2-openhermes-30k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T18_54_53.186103", "path": ["**/details_harness|winogrande|5_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T18-54-53.186103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T18_54_53.186103", "path": ["results_2024-02-01T18-54-53.186103.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T18-54-53.186103.parquet"]}]}]}
2024-02-01T18:56:56+00:00
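The config listing above pairs each per-task configuration with a timestamped split and a "latest" split that always resolves to the most recent run. Below is a minimal sketch of loading one of those configs with the `datasets` library; the repository id is a placeholder, since the details repository for this particular run is not named in this part of the record:

```python
from datasets import load_dataset

# Placeholder repo id: the exact details repository for this run is not shown here.
repo_id = "open-llm-leaderboard/details_<org>__<model>"

# "harness_hendrycksTest_world_religions_5" is one of the config names listed above;
# the "latest" split points at the most recent timestamped parquet file.
details = load_dataset(repo_id, "harness_hendrycksTest_world_religions_5", split="latest")

print(details.num_rows)       # number of evaluated examples for this task
print(details.column_names)   # inspect the per-example fields that are present
```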
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of marcel/phi-2-openhermes-30k Dataset automatically created during the evaluation run of model marcel/phi-2-openhermes-30k on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T18:54:53.186103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of marcel/phi-2-openhermes-30k\n\n\n\nDataset automatically created during the evaluation run of model marcel/phi-2-openhermes-30k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:54:53.186103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of marcel/phi-2-openhermes-30k\n\n\n\nDataset automatically created during the evaluation run of model marcel/phi-2-openhermes-30k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T18:54:53.186103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2fb5742367c65c2d0975f1e583c0132110226089
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0130_1k

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_v01_7b_ultra_0130_1k](https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0130_1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-01T19:22:26.231003](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k/blob/main/results_2024-02-01T19-22-26.231003.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.5571875979808951, "acc_stderr": 0.03397913632814242, "acc_norm": 0.5626804775870592, "acc_norm_stderr": 0.03469852298707576, "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5562309711478401, "mc2_stderr": 0.01624026004957096 }, "harness|arc:challenge|25": { "acc": 0.5503412969283277, "acc_stderr": 0.01453714444428474, "acc_norm": 0.5716723549488054, "acc_norm_stderr": 0.014460496367599017 }, "harness|hellaswag|10": { "acc": 0.6116311491734714, "acc_stderr": 0.004863831364848073, "acc_norm": 0.791575383389763, "acc_norm_stderr": 0.004053518524584593 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.04046336883978252, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.04046336883978252 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791194, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791194 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04016660030451232, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04016660030451232 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333,
"acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.037894017602836484, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.037894017602836484 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077636, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5276595744680851, "acc_stderr": 0.03263597118409769, "acc_norm": 0.5276595744680851, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070435, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070435 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3386243386243386, "acc_stderr": 0.02437319786798306, "acc_norm": 0.3386243386243386, "acc_norm_stderr": 0.02437319786798306 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.027528904299845704, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.027528904299845704 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4039408866995074, "acc_stderr": 0.034524539038220406, "acc_norm": 0.4039408866995074, "acc_norm_stderr": 0.034524539038220406 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.702020202020202, "acc_stderr": 0.03258630383836556, "acc_norm": 0.702020202020202, "acc_norm_stderr": 0.03258630383836556 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7461139896373057, "acc_stderr": 0.0314102478056532, "acc_norm": 0.7461139896373057, "acc_norm_stderr": 0.0314102478056532 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5307692307692308, "acc_stderr": 0.025302958890850154, "acc_norm": 0.5307692307692308, "acc_norm_stderr": 0.025302958890850154 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712152, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712152 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5504201680672269, "acc_stderr": 0.03231293497137707, "acc_norm": 0.5504201680672269, "acc_norm_stderr": 0.03231293497137707 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7376146788990826, "acc_stderr": 0.01886188502153473, "acc_norm": 0.7376146788990826, "acc_norm_stderr": 0.01886188502153473 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502326, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502326 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7303921568627451, "acc_stderr": 0.031145570659486782, "acc_norm": 0.7303921568627451, "acc_norm_stderr": 0.031145570659486782 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.02944377302259469, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.02944377302259469 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6502242152466368, "acc_stderr": 0.03200736719484503, "acc_norm": 0.6502242152466368, "acc_norm_stderr": 0.03200736719484503 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.04205953933884122, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.04205953933884122 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724146, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724146 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6893203883495146, "acc_stderr": 0.04582124160161551, "acc_norm": 0.6893203883495146, "acc_norm_stderr": 0.04582124160161551 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.02363687331748929, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.02363687331748929 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7471264367816092, "acc_stderr": 0.015543377313719681, "acc_norm": 0.7471264367816092, "acc_norm_stderr": 0.015543377313719681 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.615606936416185, "acc_stderr": 0.026189666966272035, "acc_norm": 0.615606936416185, "acc_norm_stderr": 0.026189666966272035 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23575418994413408, "acc_stderr": 0.014196375686290804, "acc_norm": 0.23575418994413408, "acc_norm_stderr": 0.014196375686290804 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6176470588235294, "acc_stderr": 0.027826109307283693, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.027826109307283693 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.027648149599751464, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.027648149599751464 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6049382716049383, "acc_stderr": 0.027201117666925657, "acc_norm": 
0.6049382716049383, "acc_norm_stderr": 0.027201117666925657 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.375886524822695, "acc_stderr": 0.028893955412115882, "acc_norm": 0.375886524822695, "acc_norm_stderr": 0.028893955412115882 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4048239895697523, "acc_stderr": 0.012536743830954, "acc_norm": 0.4048239895697523, "acc_norm_stderr": 0.012536743830954 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5551470588235294, "acc_stderr": 0.030187532060329383, "acc_norm": 0.5551470588235294, "acc_norm_stderr": 0.030187532060329383 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.020102583895887188, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.020102583895887188 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6448979591836734, "acc_stderr": 0.030635655150387638, "acc_norm": 0.6448979591836734, "acc_norm_stderr": 0.030635655150387638 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.030769444967296018, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.03424042924691584, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.03424042924691584 }, "harness|truthfulqa:mc|0": { "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5562309711478401, "mc2_stderr": 0.01624026004957096 }, "harness|winogrande|5": { "acc": 0.728492501973165, "acc_stderr": 0.012499326254893127 }, "harness|gsm8k|5": { "acc": 0.2630780894617134, "acc_stderr": 0.012128172607375929 } }
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
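As a quick illustration of the aggregated "results" configuration described in the card above, here is a minimal sketch; the "results" config name and the "latest" split follow the config listings shown for the other runs in this dump, and the exact column layout of the results parquet is not reproduced in this excerpt, so the snippet only inspects what it finds:

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run; "latest" resolves to the newest
# timestamped results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k",
    "results",
    split="latest",
)

# The exact schema is not shown in this excerpt, so just inspect the available fields.
print(results.column_names)
print(results[0])
```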
open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k
[ "region:us" ]
2024-02-01T19:24:50+00:00
{"pretty_name": "Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0130_1k", "dataset_summary": "Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_v01_7b_ultra_0130_1k](https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0130_1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T19:22:26.231003](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0130_1k/blob/main/results_2024-02-01T19-22-26.231003.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5571875979808951,\n \"acc_stderr\": 0.03397913632814242,\n \"acc_norm\": 0.5626804775870592,\n \"acc_norm_stderr\": 0.03469852298707576,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5562309711478401,\n \"mc2_stderr\": 0.01624026004957096\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5503412969283277,\n \"acc_stderr\": 0.01453714444428474,\n \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6116311491734714,\n \"acc_stderr\": 0.004863831364848073,\n \"acc_norm\": 0.791575383389763,\n \"acc_norm_stderr\": 0.004053518524584593\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978252,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978252\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04016660030451232,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04016660030451232\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.037894017602836484,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.037894017602836484\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.0314102478056532,\n \"acc_norm\": 0.7461139896373057,\n 
\"acc_norm_stderr\": 0.0314102478056532\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712152,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712152\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4048239895697523,\n \"acc_stderr\": 0.012536743830954,\n \"acc_norm\": 0.4048239895697523,\n \"acc_norm_stderr\": 0.012536743830954\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5562309711478401,\n \"mc2_stderr\": 0.01624026004957096\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893127\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.2630780894617134,\n \"acc_stderr\": 0.012128172607375929\n }\n}\n```", "repo_url": "https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0130_1k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-22-26.231003.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-22-26.231003.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-22-26.231003.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-22-26.231003.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-22-26.231003.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["**/details_harness|winogrande|5_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T19-22-26.231003.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T19_22_26.231003", "path": ["results_2024-02-01T19-22-26.231003.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T19-22-26.231003.parquet"]}]}]}
2024-02-01T19:25:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0130_1k Dataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0130_1k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T19:22:26.231003 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0130_1k\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0130_1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:22:26.231003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0130_1k\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0130_1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:22:26.231003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b62d15f00c3a7c99eb3a08e95a19f295a3abceeb
# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-v1-30b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ibivibiv/aegolius-acadicus-v1-30b](https://huggingface.co/ibivibiv/aegolius-acadicus-v1-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T19:29:17.332742](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b/blob/main/results_2024-02-01T19-29-17.332742.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6570400877964149, "acc_stderr": 0.03203921962388101, "acc_norm": 0.6562461966631471, "acc_norm_stderr": 0.032714406862803615, "mc1": 0.5177478580171359, "mc1_stderr": 0.017492470843075356, "mc2": 0.6706345851826353, "mc2_stderr": 0.015138706281704897 }, "harness|arc:challenge|25": { "acc": 0.7022184300341296, "acc_stderr": 0.013363080107244485, "acc_norm": 0.7261092150170648, "acc_norm_stderr": 0.013032004972989506 }, "harness|hellaswag|10": { "acc": 0.7104162517426807, "acc_stderr": 0.004526422125860671, "acc_norm": 0.8799044015136427, "acc_norm_stderr": 0.0032440893478294422 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544057, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723302, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723302 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644237, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644237 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465066, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465066 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977938, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977938 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461783, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461783 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.0251956584289318, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.0251956584289318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822914, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822914 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4335195530726257, "acc_stderr": 0.016574027219517635, "acc_norm": 0.4335195530726257, "acc_norm_stderr": 0.016574027219517635 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47327249022164275, "acc_stderr": 0.012751977967676008, "acc_norm": 0.47327249022164275, "acc_norm_stderr": 0.012751977967676008 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.02850145286039656, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039656 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687495, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5177478580171359, "mc1_stderr": 0.017492470843075356, "mc2": 0.6706345851826353, "mc2_stderr": 0.015138706281704897 }, "harness|winogrande|5": { "acc": 0.8484609313338595, "acc_stderr": 0.010077698907571778 }, "harness|gsm8k|5": { "acc": 0.7058377558756633, "acc_stderr": 0.01255128533147015 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
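As a complement to the per-task `load_dataset` call shown earlier in this card, here is a minimal sketch of how the available configs can be enumerated and the aggregated "results" config pulled for this run. It assumes the Hugging Face `datasets` library and network access; no specific column names are asserted, they are only printed.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(REPO)
print(f"{len(configs)} configs, e.g. {configs[:3]}")

# The "latest" split of the "results" config points at the most recent run.
results = load_dataset(REPO, "results", split="latest")
print(results.column_names)
```

Per-task details can be loaded the same way by swapping "results" for any of the `harness_*` config names listed in the metadata below.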
open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b
[ "region:us" ]
2024-02-01T19:31:38+00:00
{"pretty_name": "Evaluation run of ibivibiv/aegolius-acadicus-v1-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibivibiv/aegolius-acadicus-v1-30b](https://huggingface.co/ibivibiv/aegolius-acadicus-v1-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T19:29:17.332742](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b/blob/main/results_2024-02-01T19-29-17.332742.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6570400877964149,\n \"acc_stderr\": 0.03203921962388101,\n \"acc_norm\": 0.6562461966631471,\n \"acc_norm_stderr\": 0.032714406862803615,\n \"mc1\": 0.5177478580171359,\n \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6706345851826353,\n \"mc2_stderr\": 0.015138706281704897\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244485,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7104162517426807,\n \"acc_stderr\": 0.004526422125860671,\n \"acc_norm\": 0.8799044015136427,\n \"acc_norm_stderr\": 0.0032440893478294422\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723302,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5177478580171359,\n \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6706345851826353,\n \"mc2_stderr\": 0.015138706281704897\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \"acc_stderr\": 0.01255128533147015\n }\n}\n```", "repo_url": 
"https://huggingface.co/ibivibiv/aegolius-acadicus-v1-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-29-17.332742.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-29-17.332742.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-29-17.332742.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-29-17.332742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-29-17.332742.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T19_29_17.332742", "path": ["**/details_harness|winogrande|5_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T19-29-17.332742.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T19_29_17.332742", "path": ["results_2024-02-01T19-29-17.332742.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T19-29-17.332742.parquet"]}]}]}
2024-02-01T19:32:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-v1-30b Dataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-v1-30b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T19:29:17.332742 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
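The loading snippet referenced above ("you can for instance do the following:") was stripped from this plain-text rendering of the card. A minimal sketch of that call, assuming the `open-llm-leaderboard/details_<org>__<model>` repository naming convention these evaluation datasets follow, and using one of the config names listed in this record's metadata:

```python
from datasets import load_dataset

# Repository id is inferred from the evaluated model name (assumption);
# "harness_winogrande_5" is one of the config names listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-v1-30b",
    "harness_winogrande_5",
    split="train",
)
```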
[ "# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-v1-30b\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-v1-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:29:17.332742(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-v1-30b\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-v1-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:29:17.332742(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
76ebcaa3378a035f08efe256836a719edca98f0a
# Dataset Card for Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T19:40:32.178744](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo/blob/main/results_2024-02-01T19-40-32.178744.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6578797312429587, "acc_stderr": 0.03193533127801232, "acc_norm": 0.6595146193672972, "acc_norm_stderr": 0.03257985736954445, "mc1": 0.6046511627906976, "mc1_stderr": 0.017115815632418208, "mc2": 0.7675318116403941, "mc2_stderr": 0.01417571671037387 }, "harness|arc:challenge|25": { "acc": 0.6953924914675768, "acc_stderr": 0.013449522109932487, "acc_norm": 0.7209897610921502, "acc_norm_stderr": 0.01310678488360134 }, "harness|hellaswag|10": { "acc": 0.710017924716192, "acc_stderr": 0.004528264116475881, "acc_norm": 0.8843855805616411, "acc_norm_stderr": 0.003191084792793155 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.743421052631579, "acc_stderr": 0.0355418036802569, "acc_norm": 0.743421052631579, "acc_norm_stderr": 0.0355418036802569 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909282, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, 
"acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062946, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6127659574468085, "acc_stderr": 0.03184389265339526, "acc_norm": 0.6127659574468085, "acc_norm_stderr": 0.03184389265339526 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.040703290137070705, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.040703290137070705 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.48412698412698413, "acc_stderr": 0.025738330639412152, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.025738330639412152 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8032258064516129, "acc_stderr": 0.022616409420742025, "acc_norm": 0.8032258064516129, "acc_norm_stderr": 0.022616409420742025 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.035179450386910616, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8484848484848485, "acc_stderr": 0.025545650426603627, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.025545650426603627 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.02293514405391943, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.02293514405391943 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465073, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465073 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.0302839955258844, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3973509933774834, "acc_stderr": 0.039955240076816806, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.039955240076816806 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.01584825580650159, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.01584825580650159 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5694444444444444, "acc_stderr": 0.03376922151252335, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.03376922151252335 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8354430379746836, "acc_stderr": 0.024135736240566932, "acc_norm": 0.8354430379746836, "acc_norm_stderr": 0.024135736240566932 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.039153454088478354, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.039153454088478354 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909456, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909456 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.034926064766237906, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.034926064766237906 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560403, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560403 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.023357365785874037, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.023357365785874037 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7516339869281046, "acc_stderr": 0.02473998135511359, "acc_norm": 0.7516339869281046, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7808641975308642, "acc_stderr": 0.023016705640262196, "acc_norm": 0.7808641975308642, "acc_norm_stderr": 
0.023016705640262196 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48891786179921776, "acc_stderr": 0.012767098998525843, "acc_norm": 0.48891786179921776, "acc_norm_stderr": 0.012767098998525843 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7426470588235294, "acc_stderr": 0.02655651947004151, "acc_norm": 0.7426470588235294, "acc_norm_stderr": 0.02655651947004151 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.027833023871399683, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.027833023871399683 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786845, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786845 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197768, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197768 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.03851597683718533, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.03851597683718533 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7543859649122807, "acc_stderr": 0.0330140594698725, "acc_norm": 0.7543859649122807, "acc_norm_stderr": 0.0330140594698725 }, "harness|truthfulqa:mc|0": { "mc1": 0.6046511627906976, "mc1_stderr": 0.017115815632418208, "mc2": 0.7675318116403941, "mc2_stderr": 0.01417571671037387 }, "harness|winogrande|5": { "acc": 0.8271507498026835, "acc_stderr": 0.010626964529971862 }, "harness|gsm8k|5": { "acc": 0.5921152388172858, "acc_stderr": 0.01353674207564309 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
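The card above shows how to load a single task configuration; it also notes that the aggregated run metrics live in a separate "results" configuration. A minimal sketch of reading those aggregates, assuming the "results" configuration and "latest" split layout that these evaluation datasets use:

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of each run; the "latest"
# split is assumed to alias the most recent results parquet, as in the configs
# listing shown for the other evaluation record above.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracies for the latest evaluation run
```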
open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo
[ "region:us" ]
2024-02-01T19:42:52+00:00
{"pretty_name": "Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T19:40:32.178744](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo/blob/main/results_2024-02-01T19-40-32.178744.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578797312429587,\n \"acc_stderr\": 0.03193533127801232,\n \"acc_norm\": 0.6595146193672972,\n \"acc_norm_stderr\": 0.03257985736954445,\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7675318116403941,\n \"mc2_stderr\": 0.01417571671037387\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932487,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360134\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.710017924716192,\n \"acc_stderr\": 0.004528264116475881,\n \"acc_norm\": 0.8843855805616411,\n \"acc_norm_stderr\": 0.003191084792793155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n \"acc_norm\": 0.8860103626943006,\n 
\"acc_norm_stderr\": 0.02293514405391943\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650159,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650159\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n \"acc_stderr\": 0.012767098998525843,\n \"acc_norm\": 0.48891786179921776,\n \"acc_norm_stderr\": 0.012767098998525843\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7675318116403941,\n \"mc2_stderr\": 0.01417571671037387\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971862\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5921152388172858,\n \"acc_stderr\": 0.01353674207564309\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["**/details_harness|winogrande|5_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T19-40-32.178744.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T19_40_32.178744", "path": ["results_2024-02-01T19-40-32.178744.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T19-40-32.178744.parquet"]}]}]}
2024-02-01T19:43:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo Dataset automatically created during the evaluation run of model macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T19:40:32.178744 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:40:32.178744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:40:32.178744(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7260cd4a010a55ff9a4f5eb8714bbd168bb42e60
# Dataset Card for Evaluation run of yunconglong/DARE_TIES_13B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [yunconglong/DARE_TIES_13B](https://huggingface.co/yunconglong/DARE_TIES_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yunconglong__DARE_TIES_13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T19:46:16.300212](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__DARE_TIES_13B/blob/main/results_2024-02-01T19-46-16.300212.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6515889964448247, "acc_stderr": 0.03210174624245573, "acc_norm": 0.6506057322516777, "acc_norm_stderr": 0.03278666812910722, "mc1": 0.6352509179926561, "mc1_stderr": 0.016850961061720137, "mc2": 0.7865638980237093, "mc2_stderr": 0.01379067926936144 }, "harness|arc:challenge|25": { "acc": 0.7167235494880546, "acc_stderr": 0.013167478735134575, "acc_norm": 0.7431740614334471, "acc_norm_stderr": 0.0127669237941168 }, "harness|hellaswag|10": { "acc": 0.7263493328022307, "acc_stderr": 0.00444920629592239, "acc_norm": 0.895040828520215, "acc_norm_stderr": 0.0030587440442413545 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268542, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268542 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198892, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198892 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601436, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601436 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577605, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577605 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.45363128491620114, "acc_stderr": 0.016650437588269076, "acc_norm": 0.45363128491620114, "acc_norm_stderr": 0.016650437588269076 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667874, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885135, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885135 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657474, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657474 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.684640522875817, "acc_stderr": 0.018798086284886887, "acc_norm": 0.684640522875817, "acc_norm_stderr": 0.018798086284886887 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.6352509179926561, "mc1_stderr": 0.016850961061720137, "mc2": 0.7865638980237093, "mc2_stderr": 0.01379067926936144 }, "harness|winogrande|5": { "acc": 0.8808208366219415, "acc_stderr": 0.009105988620006186 }, "harness|gsm8k|5": { "acc": 0.6755117513267627, "acc_stderr": 0.012896095359768114 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
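As a complement to the loading example earlier in this card, the aggregated scores can also be pulled directly from the "results" configuration mentioned above. The following is a minimal sketch, not an official snippet from the card: the repository id and the "results" config name come from the card itself, the "latest" split alias comes from this dataset's configuration metadata (it points to the most recent timestamped run), and the inspection lines at the end are only an illustrative assumption about how one might look at the loaded rows.

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run: the "results" config,
# whose "latest" split resolves to the newest timestamped results file.
results = load_dataset(
    "open-llm-leaderboard/details_yunconglong__DARE_TIES_13B",
    "results",
    split="latest",
)

# Quick sanity check of what the aggregated results table contains
# (the exact column layout is not documented in this card).
print(results.column_names)
print(results[0])
```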
open-llm-leaderboard/details_yunconglong__DARE_TIES_13B
[ "region:us" ]
2024-02-01T19:48:35+00:00
{"pretty_name": "Evaluation run of yunconglong/DARE_TIES_13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/DARE_TIES_13B](https://huggingface.co/yunconglong/DARE_TIES_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__DARE_TIES_13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T19:46:16.300212](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__DARE_TIES_13B/blob/main/results_2024-02-01T19-46-16.300212.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6515889964448247,\n \"acc_stderr\": 0.03210174624245573,\n \"acc_norm\": 0.6506057322516777,\n \"acc_norm_stderr\": 0.03278666812910722,\n \"mc1\": 0.6352509179926561,\n \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.7865638980237093,\n \"mc2_stderr\": 0.01379067926936144\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n \"acc_norm\": 0.7431740614334471,\n \"acc_norm_stderr\": 0.0127669237941168\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7263493328022307,\n \"acc_stderr\": 0.00444920629592239,\n \"acc_norm\": 0.895040828520215,\n \"acc_norm_stderr\": 0.0030587440442413545\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 
0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.45363128491620114,\n \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657474,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657474\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.7865638980237093,\n \"mc2_stderr\": 0.01379067926936144\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8808208366219415,\n \"acc_stderr\": 0.009105988620006186\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \"acc_stderr\": 0.012896095359768114\n }\n}\n```", "repo_url": 
"https://huggingface.co/yunconglong/DARE_TIES_13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T19_46_16.300212", "path": ["**/details_harness|winogrande|5_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T19-46-16.300212.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T19_46_16.300212", "path": ["results_2024-02-01T19-46-16.300212.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T19-46-16.300212.parquet"]}]}]}
2024-02-01T19:49:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yunconglong/DARE_TIES_13B Dataset automatically created during the evaluation run of model yunconglong/DARE_TIES_13B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T19:46:16.300212 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of yunconglong/DARE_TIES_13B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/DARE_TIES_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:46:16.300212(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yunconglong/DARE_TIES_13B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/DARE_TIES_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T19:46:16.300212(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4f1c1af31947cdca46a378b07713b35d6a51c968
# Dataset Card for Evaluation run of FelixChao/Patronum-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [FelixChao/Patronum-7B](https://huggingface.co/FelixChao/Patronum-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FelixChao__Patronum-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T20:06:23.836193](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Patronum-7B/blob/main/results_2024-02-01T20-06-23.836193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6534757286766597, "acc_stderr": 0.03192548199910663, "acc_norm": 0.6535546669851309, "acc_norm_stderr": 0.032578444148951814, "mc1": 0.5312117503059975, "mc1_stderr": 0.017469364874577526, "mc2": 0.7040594573714476, "mc2_stderr": 0.014835986297725817 }, "harness|arc:challenge|25": { "acc": 0.6868600682593856, "acc_stderr": 0.013552671543623497, "acc_norm": 0.7167235494880546, "acc_norm_stderr": 0.013167478735134573 }, "harness|hellaswag|10": { "acc": 0.711611232822147, "acc_stderr": 0.004520870679457036, "acc_norm": 0.8832901812387971, "acc_norm_stderr": 0.0032041800729423757 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 
0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026704, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026704 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644237, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644237 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6846153846153846, "acc_stderr": 0.02355964698318994, "acc_norm": 0.6846153846153846, "acc_norm_stderr": 0.02355964698318994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.02882088466625326, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.02882088466625326 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.029953823891887037, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887037 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 0.32450331125827814, 
"acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8532110091743119, "acc_stderr": 0.01517314184512625, "acc_norm": 0.8532110091743119, "acc_norm_stderr": 0.01517314184512625 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579647, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579647 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406974, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406974 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38324022346368714, "acc_stderr": 0.016260159604429125, "acc_norm": 0.38324022346368714, "acc_norm_stderr": 0.016260159604429125 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667885, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667885 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.023993501709042103, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.023993501709042103 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.02975238965742705, 
"acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.02975238965742705 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47392438070404175, "acc_stderr": 0.012752858346533126, "acc_norm": 0.47392438070404175, "acc_norm_stderr": 0.012752858346533126 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6985294117647058, "acc_stderr": 0.027875982114273168, "acc_norm": 0.6985294117647058, "acc_norm_stderr": 0.027875982114273168 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6862745098039216, "acc_stderr": 0.018771683893528176, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.018771683893528176 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.028920583220675606, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.028920583220675606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5312117503059975, "mc1_stderr": 0.017469364874577526, "mc2": 0.7040594573714476, "mc2_stderr": 0.014835986297725817 }, "harness|winogrande|5": { "acc": 0.8184688239936859, "acc_stderr": 0.010833276515007493 }, "harness|gsm8k|5": { "acc": 0.6853677028051555, "acc_stderr": 0.01279103722733603 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
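Beyond the per-task example shown in the card above, the aggregated scores for FelixChao/Patronum-7B can be pulled from the "results" configuration. A minimal sketch; the config name and the "latest" split come from the metadata below:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__Patronum-7B",
    "results",
    split="latest",
)
print(results)
```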
open-llm-leaderboard/details_FelixChao__Patronum-7B
[ "region:us" ]
2024-02-01T20:08:42+00:00
{"pretty_name": "Evaluation run of FelixChao/Patronum-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Patronum-7B](https://huggingface.co/FelixChao/Patronum-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Patronum-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T20:06:23.836193](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Patronum-7B/blob/main/results_2024-02-01T20-06-23.836193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534757286766597,\n \"acc_stderr\": 0.03192548199910663,\n \"acc_norm\": 0.6535546669851309,\n \"acc_norm_stderr\": 0.032578444148951814,\n \"mc1\": 0.5312117503059975,\n \"mc1_stderr\": 0.017469364874577526,\n \"mc2\": 0.7040594573714476,\n \"mc2_stderr\": 0.014835986297725817\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623497,\n \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134573\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.711611232822147,\n \"acc_stderr\": 0.004520870679457036,\n \"acc_norm\": 0.8832901812387971,\n \"acc_norm_stderr\": 0.0032041800729423757\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6846153846153846,\n \"acc_stderr\": 
0.02355964698318994,\n \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429125,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429125\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5312117503059975,\n \"mc1_stderr\": 0.017469364874577526,\n \"mc2\": 0.7040594573714476,\n \"mc2_stderr\": 0.014835986297725817\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007493\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \"acc_stderr\": 0.01279103722733603\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Patronum-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-06-23.836193.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-06-23.836193.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-06-23.836193.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-06-23.836193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-06-23.836193.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-06-23.836193.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["**/details_harness|winogrande|5_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T20-06-23.836193.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T20_06_23.836193", "path": ["results_2024-02-01T20-06-23.836193.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T20-06-23.836193.parquet"]}]}]}
2024-02-01T20:09:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/Patronum-7B Dataset automatically created during the evaluation run of model FelixChao/Patronum-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T20:06:23.836193 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
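The loading snippet referenced just above ("To load the details from a run, you can for instance do the following:") was dropped from this flattened copy of the card; a minimal sketch of the intended call, assuming the repository id follows the open-llm-leaderboard/details_<org>__<model> convention used by the other cards in this dump:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the details_<org>__<model> naming convention
# used elsewhere in this dump; verify it before relying on it.
repo_id = "open-llm-leaderboard/details_FelixChao__Patronum-7B"

# Load one evaluated task; the "latest" split always points at the most recent run.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```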
[ "# Dataset Card for Evaluation run of FelixChao/Patronum-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Patronum-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:06:23.836193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/Patronum-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Patronum-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:06:23.836193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
cc9e741c23b8f3326704ca8e457b9eaec3ae6735
# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vilm/Mixsmol-4x400M-v0.1-epoch2](https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2", "harness_winogrande_5", split="train") ``` (A fuller usage sketch appears at the end of this card.) ## Latest results These are the [latest results from run 2024-02-01T20:12:11.132226](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2/blob/main/results_2024-02-01T20-12-11.132226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25283618854191925, "acc_stderr": 0.03066762235665367, "acc_norm": 0.25354322916408834, "acc_norm_stderr": 0.03146102376966158, "mc1": 0.20318237454100369, "mc1_stderr": 0.014085666526340886, "mc2": 0.3923567669357625, "mc2_stderr": 0.014714668526801372 }, "harness|arc:challenge|25": { "acc": 0.20051194539249148, "acc_stderr": 0.011700318050499382, "acc_norm": 0.2354948805460751, "acc_norm_stderr": 0.01239945185500475 }, "harness|hellaswag|10": { "acc": 0.2962557259510058, "acc_stderr": 0.004556719864763093, "acc_norm": 0.3260306711810396, "acc_norm_stderr": 0.004678006403691724 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2, "acc_stderr": 0.034554737023254366, "acc_norm": 0.2, "acc_norm_stderr": 0.034554737023254366 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.22264150943396227, "acc_stderr": 0.025604233470899098, "acc_norm": 0.22264150943396227, "acc_norm_stderr": 0.025604233470899098 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.03586879280080343, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.03586879280080343 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.03775251680686371, "acc_norm": 0.17, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2832369942196532, "acc_stderr": 0.03435568056047874, "acc_norm": 0.2832369942196532, "acc_norm_stderr": 0.03435568056047874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.21, "acc_stderr": 0.04093601807403326, "acc_norm": 0.21, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.02880998985410297, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.02880998985410297 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220554, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220554 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.22758620689655173, "acc_stderr": 0.03493950380131183, "acc_norm": 0.22758620689655173, "acc_norm_stderr": 0.03493950380131183 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24603174603174602, "acc_stderr": 0.022182037202948365, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.022182037202948365 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15873015873015872, "acc_stderr": 0.032684540130117436, "acc_norm": 0.15873015873015872, "acc_norm_stderr": 0.032684540130117436 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3064516129032258, "acc_stderr": 0.026226485652553883, "acc_norm": 0.3064516129032258, "acc_norm_stderr": 0.026226485652553883 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2955665024630542, "acc_stderr": 0.032104944337514575, "acc_norm": 0.2955665024630542, "acc_norm_stderr": 0.032104944337514575 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25757575757575757, "acc_stderr": 0.031156269519646826, "acc_norm": 0.25757575757575757, "acc_norm_stderr": 0.031156269519646826 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.34196891191709844, "acc_stderr": 0.03423465100104281, "acc_norm": 0.34196891191709844, "acc_norm_stderr": 0.03423465100104281 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.26153846153846155, "acc_stderr": 0.02228214120420442, "acc_norm": 0.26153846153846155, "acc_norm_stderr": 0.02228214120420442 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3319327731092437, "acc_stderr": 0.030588697013783663, "acc_norm": 0.3319327731092437, "acc_norm_stderr": 0.030588697013783663 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, 
"acc_stderr": 0.036030385453603826, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.036030385453603826 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22568807339449543, "acc_stderr": 0.01792308766780306, "acc_norm": 0.22568807339449543, "acc_norm_stderr": 0.01792308766780306 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4537037037037037, "acc_stderr": 0.03395322726375798, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.03395322726375798 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.21518987341772153, "acc_stderr": 0.026750826994676173, "acc_norm": 0.21518987341772153, "acc_norm_stderr": 0.026750826994676173 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.21973094170403587, "acc_stderr": 0.027790177064383602, "acc_norm": 0.21973094170403587, "acc_norm_stderr": 0.027790177064383602 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.26717557251908397, "acc_stderr": 0.038808483010823944, "acc_norm": 0.26717557251908397, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.36363636363636365, "acc_stderr": 0.04391326286724071, "acc_norm": 0.36363636363636365, "acc_norm_stderr": 0.04391326286724071 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.27607361963190186, "acc_stderr": 0.03512385283705051, "acc_norm": 0.27607361963190186, "acc_norm_stderr": 0.03512385283705051 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.1875, "acc_stderr": 0.0370468111477387, "acc_norm": 0.1875, "acc_norm_stderr": 0.0370468111477387 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.25213675213675213, "acc_stderr": 0.02844796547623101, "acc_norm": 0.25213675213675213, "acc_norm_stderr": 0.02844796547623101 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2656449553001277, "acc_stderr": 0.01579430248788872, "acc_norm": 0.2656449553001277, "acc_norm_stderr": 0.01579430248788872 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.23121387283236994, "acc_stderr": 0.022698657167855713, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.022698657167855713 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23128491620111732, "acc_stderr": 0.01410222362315257, "acc_norm": 0.23128491620111732, "acc_norm_stderr": 0.01410222362315257 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2549019607843137, "acc_stderr": 0.02495418432487991, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.24115755627009647, "acc_stderr": 0.02429659403476343, "acc_norm": 0.24115755627009647, "acc_norm_stderr": 0.02429659403476343 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.20987654320987653, "acc_stderr": 0.02265834408598136, "acc_norm": 0.20987654320987653, "acc_norm_stderr": 0.02265834408598136 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.22695035460992907, "acc_stderr": 0.024987106365642966, "acc_norm": 0.22695035460992907, "acc_norm_stderr": 0.024987106365642966 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.25, "acc_stderr": 0.026303648393696036, "acc_norm": 0.25, "acc_norm_stderr": 0.026303648393696036 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2545454545454545, "acc_stderr": 0.041723430387053825, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.24897959183673468, "acc_stderr": 0.027682979522960234, "acc_norm": 0.24897959183673468, "acc_norm_stderr": 0.027682979522960234 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2835820895522388, "acc_stderr": 0.031871875379197966, "acc_norm": 0.2835820895522388, "acc_norm_stderr": 0.031871875379197966 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.2289156626506024, "acc_stderr": 0.03270745277352477, "acc_norm": 0.2289156626506024, "acc_norm_stderr": 0.03270745277352477 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03218093795602357, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.20318237454100369, "mc1_stderr": 0.014085666526340886, "mc2": 0.3923567669357625, "mc2_stderr": 0.014714668526801372 }, "harness|winogrande|5": { "acc": 0.526440410418311, "acc_stderr": 0.014032823874407224 }, "harness|gsm8k|5": { "acc": 0.002274450341167551, "acc_stderr": 0.0013121578148674261 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
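As a complement to the loading snippet in the card above, here is a minimal sketch (assuming the `datasets` library is installed) that pulls the aggregated "results" config and one per-task config for the latest run of this evaluation; the config names come from the metadata listed below:

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2"

# Aggregated metrics for the most recent run; "latest" tracks the newest timestamped split.
results = load_dataset(repo_id, "results", split="latest")
print(results)

# Per-task details live under their own configs, e.g. the 5-shot GSM8K run.
gsm8k_details = load_dataset(repo_id, "harness_gsm8k_5", split="latest")
print(gsm8k_details[0])
```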
open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2
[ "region:us" ]
2024-02-01T20:14:37+00:00
{"pretty_name": "Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch2", "dataset_summary": "Dataset automatically created during the evaluation run of model [vilm/Mixsmol-4x400M-v0.1-epoch2](https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T20:12:11.132226](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2/blob/main/results_2024-02-01T20-12-11.132226.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25283618854191925,\n \"acc_stderr\": 0.03066762235665367,\n \"acc_norm\": 0.25354322916408834,\n \"acc_norm_stderr\": 0.03146102376966158,\n \"mc1\": 0.20318237454100369,\n \"mc1_stderr\": 0.014085666526340886,\n \"mc2\": 0.3923567669357625,\n \"mc2_stderr\": 0.014714668526801372\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20051194539249148,\n \"acc_stderr\": 0.011700318050499382,\n \"acc_norm\": 0.2354948805460751,\n \"acc_norm_stderr\": 0.01239945185500475\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2962557259510058,\n \"acc_stderr\": 0.004556719864763093,\n \"acc_norm\": 0.3260306711810396,\n \"acc_norm_stderr\": 0.004678006403691724\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 
0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.03435568056047874,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.03435568056047874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n \"acc_stderr\": 0.032684540130117436,\n \"acc_norm\": 0.15873015873015872,\n \"acc_norm_stderr\": 0.032684540130117436\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3064516129032258,\n \"acc_stderr\": 0.026226485652553883,\n \"acc_norm\": 0.3064516129032258,\n \"acc_norm_stderr\": 0.026226485652553883\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646826,\n \"acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646826\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.34196891191709844,\n \"acc_stderr\": 0.03423465100104281,\n \"acc_norm\": 0.34196891191709844,\n \"acc_norm_stderr\": 0.03423465100104281\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.02228214120420442,\n \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.02228214120420442\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3319327731092437,\n \"acc_stderr\": 0.030588697013783663,\n \"acc_norm\": 0.3319327731092437,\n \"acc_norm_stderr\": 0.030588697013783663\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375798,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375798\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.21518987341772153,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.21973094170403587,\n \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.03512385283705051,\n \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.03512385283705051\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623101,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623101\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n \"acc_stderr\": 0.01579430248788872,\n 
\"acc_norm\": 0.2656449553001277,\n \"acc_norm_stderr\": 0.01579430248788872\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855713,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855713\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n \"acc_stderr\": 0.01410222362315257,\n \"acc_norm\": 0.23128491620111732,\n \"acc_norm_stderr\": 0.01410222362315257\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.02429659403476343,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.02429659403476343\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.02265834408598136,\n \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.02265834408598136\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642966,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642966\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20318237454100369,\n \"mc1_stderr\": 0.014085666526340886,\n \"mc2\": 0.3923567669357625,\n \"mc2_stderr\": 0.014714668526801372\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.526440410418311,\n \"acc_stderr\": 0.014032823874407224\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674261\n }\n}\n```", "repo_url": "https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-12-11.132226.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-12-11.132226.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-12-11.132226.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-12-11.132226.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-12-11.132226.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-12-11.132226.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["**/details_harness|winogrande|5_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T20-12-11.132226.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T20_12_11.132226", "path": ["results_2024-02-01T20-12-11.132226.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T20-12-11.132226.parquet"]}]}]}
2024-02-01T20:15:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch2 Dataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T20:12:11.132226 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
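The loading snippet that the sentence "To load the details from a run, you can for instance do the following" refers to was stripped from this plain-text copy of the card. Below is a minimal sketch of what it would look like, assuming this run's details live under the usual `open-llm-leaderboard/details_<org>__<model>` naming scheme and expose the same configuration names as the sibling evaluation datasets in this dump; the exact repository id and configuration name are assumptions, not taken from this record.

```python
# Minimal sketch. The repository id below is inferred from the standard
# open-llm-leaderboard naming convention and is NOT stated in this record;
# the configuration name follows the pattern used by the sibling datasets.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch2",  # assumed repo id
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",  # the card states this split points at the latest results
)
```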
[ "# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch2\n\n\n\nDataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:12:11.132226(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch2\n\n\n\nDataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:12:11.132226(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0fe485fd6a929d036e5347209c3d75addd7ae731
# Test Case Steps Synthetically generated test case steps for an example booking system.
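For readers who want to pull these synthetic steps into a script, a minimal loading sketch is given below; the dataset id comes from this record, but the split name and record layout are assumptions, since the card itself does not document them.

```python
# Minimal sketch: the "train" split name and record layout are assumed,
# as the card does not document the dataset's structure.
from datasets import load_dataset

steps = load_dataset("2bittester/test-case-steps", split="train")
print(steps[0])  # inspect one synthetic test-case-step record
```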
2bittester/test-case-steps
[ "language:eng", "license:mit", "Synthetic Data", "region:us" ]
2024-02-01T20:15:29+00:00
{"language": ["eng"], "license": "mit", "pretty_name": "Test Case Steps", "tags": ["Synthetic Data"]}
2024-02-01T20:32:14+00:00
[]
[ "eng" ]
TAGS #language-English #license-mit #Synthetic Data #region-us
# Test Case Steps Synthetically generated test case steps for an example booking system.
[ "# Test Case Steps\n\nSynthetically generated test case steps for an example booking system." ]
[ "TAGS\n#language-English #license-mit #Synthetic Data #region-us \n", "# Test Case Steps\n\nSynthetically generated test case steps for an example booking system." ]
f5cbb55d9af8869b168697df38c7d694a65faec3
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-dpo](https://huggingface.co/abhishekchohan/mistral-7B-forest-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T20:33:21.801707](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo/blob/main/results_2024-02-01T20-33-21.801707.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6284857474273647, "acc_stderr": 0.03255562209628959, "acc_norm": 0.6347376538626387, "acc_norm_stderr": 0.0332275449377842, "mc1": 0.39412484700122397, "mc1_stderr": 0.017106588140700325, "mc2": 0.554347972654007, "mc2_stderr": 0.01584708837699472 }, "harness|arc:challenge|25": { "acc": 0.6168941979522184, "acc_stderr": 0.01420647266167288, "acc_norm": 0.6501706484641638, "acc_norm_stderr": 0.013936809212158294 }, "harness|hellaswag|10": { "acc": 0.6856203943437562, "acc_stderr": 0.004633194825793845, "acc_norm": 0.8630750846444931, "acc_norm_stderr": 0.0034306550069275778 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.038947344870133176, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.038947344870133176 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.028637235639800897, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.028637235639800897 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6416184971098265, "acc_stderr": 0.03656343653353159, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.03656343653353159 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.049598599663841815, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.049598599663841815 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5276595744680851, "acc_stderr": 0.03263597118409769, "acc_norm": 0.5276595744680851, "acc_norm_stderr": 0.03263597118409769 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.025197101074246487, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.025197101074246487 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7419354838709677, "acc_stderr": 0.024892469172462833, "acc_norm": 0.7419354838709677, "acc_norm_stderr": 0.024892469172462833 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.024321738484602354, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.024321738484602354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228412, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228412 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.03128217706368461, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.03128217706368461 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 
0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8201834862385321, "acc_stderr": 0.016465345467391552, "acc_norm": 0.8201834862385321, "acc_norm_stderr": 0.016465345467391552 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.03392238405321617, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.028867431449849323, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.028867431449849323 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516302, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516302 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822585, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822585 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.0230866350868414, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.0230866350868414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7994891443167306, "acc_stderr": 0.014317653708594204, "acc_norm": 0.7994891443167306, "acc_norm_stderr": 0.014317653708594204 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38212290502793295, "acc_stderr": 0.016251139711570772, "acc_norm": 0.38212290502793295, "acc_norm_stderr": 0.016251139711570772 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.02573885479781872, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.02573885479781872 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399665, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399665 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7160493827160493, "acc_stderr": 0.02508947852376513, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 0.02508947852376513 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4432624113475177, "acc_stderr": 0.029634838473766006, "acc_norm": 0.4432624113475177, "acc_norm_stderr": 0.029634838473766006 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.439374185136897, "acc_stderr": 0.012676014778580215, "acc_norm": 0.439374185136897, "acc_norm_stderr": 0.012676014778580215 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6584967320261438, "acc_stderr": 0.01918463932809249, "acc_norm": 0.6584967320261438, "acc_norm_stderr": 0.01918463932809249 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982062, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982062 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7960199004975125, "acc_stderr": 0.02849317624532607, "acc_norm": 0.7960199004975125, "acc_norm_stderr": 0.02849317624532607 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.038695433234721015, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.038695433234721015 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.02753912288906145, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.02753912288906145 }, "harness|truthfulqa:mc|0": { "mc1": 0.39412484700122397, "mc1_stderr": 0.017106588140700325, "mc2": 0.554347972654007, "mc2_stderr": 0.01584708837699472 }, "harness|winogrande|5": { "acc": 0.7955801104972375, "acc_stderr": 0.011334090612597212 }, "harness|gsm8k|5": { "acc": 0.3032600454890068, "acc_stderr": 0.012661502663418698 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
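The card above mentions an aggregated "results" configuration but only shows how to load a single task. A minimal sketch for reading those aggregated numbers is given below; the "latest" split name mirrors the sibling evaluation datasets in this dump and is an assumption for this particular repository, as is the exact parquet schema.

```python
# Minimal sketch: loads the aggregated "results" configuration for this run.
# The "latest" split name and the parquet schema are assumptions based on
# the sibling open-llm-leaderboard details datasets, not on this card.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo",
    "results",
    split="latest",
)
print(results.column_names)  # inspect which aggregated metrics are stored
print(results[0])            # first row of the aggregated results
```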
open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo
[ "region:us" ]
2024-02-01T20:35:44+00:00
{"pretty_name": "Evaluation run of abhishekchohan/mistral-7B-forest-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-dpo](https://huggingface.co/abhishekchohan/mistral-7B-forest-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T20:33:21.801707](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo/blob/main/results_2024-02-01T20-33-21.801707.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6284857474273647,\n \"acc_stderr\": 0.03255562209628959,\n \"acc_norm\": 0.6347376538626387,\n \"acc_norm_stderr\": 0.0332275449377842,\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.554347972654007,\n \"mc2_stderr\": 0.01584708837699472\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.01420647266167288,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158294\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6856203943437562,\n \"acc_stderr\": 0.004633194825793845,\n \"acc_norm\": 0.8630750846444931,\n \"acc_norm_stderr\": 0.0034306550069275778\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800897,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800897\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462833,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391552,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391552\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849323,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849323\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n \"acc_stderr\": 0.014317653708594204,\n \"acc_norm\": 0.7994891443167306,\n 
\"acc_norm_stderr\": 0.014317653708594204\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570772,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570772\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781872,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781872\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.02508947852376513,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.02508947852376513\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580215,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580215\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.01918463932809249,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.01918463932809249\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.554347972654007,\n \"mc2_stderr\": 0.01584708837699472\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597212\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3032600454890068,\n \"acc_stderr\": 0.012661502663418698\n }\n}\n```", "repo_url": "https://huggingface.co/abhishekchohan/mistral-7B-forest-dpo", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-33-21.801707.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-33-21.801707.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-33-21.801707.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-33-21.801707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-33-21.801707.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-33-21.801707.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["**/details_harness|winogrande|5_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T20-33-21.801707.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T20_33_21.801707", "path": ["results_2024-02-01T20-33-21.801707.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T20-33-21.801707.parquet"]}]}]}
2024-02-01T20:36:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-dpo Dataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T20:33:21.801707 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
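The card text above ends its loading instructions at "you can for instance do the following:", but the code snippet itself was dropped when the markdown was flattened. Based on the loading example the sibling cards in this dump carry, it would look roughly like the sketch below; the repository id is again the assumed `details_<org>__<model>` form rather than one quoted in this row.

```python
from datasets import load_dataset

# Assumed repo id for this evaluation-details dataset.
data = load_dataset(
    "open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-dpo",
    "harness_winogrande_5",
    split="train",
)
```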
[ "# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-dpo\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:33:21.801707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-dpo\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:33:21.801707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0c9c9606a5908193014099008a563b5cd9331895
# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-34b-v3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ibivibiv/aegolius-acadicus-34b-v3](https://huggingface.co/ibivibiv/aegolius-acadicus-34b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T20:36:43.405455](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3/blob/main/results_2024-02-01T20-36-43.405455.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6235690414004832, "acc_stderr": 0.03283989975799449, "acc_norm": 0.6262345987021161, "acc_norm_stderr": 0.03349719933623391, "mc1": 0.46878824969400246, "mc1_stderr": 0.017469364874577533, "mc2": 0.6333324983158626, "mc2_stderr": 0.015558097913908223 }, "harness|arc:challenge|25": { "acc": 0.6254266211604096, "acc_stderr": 0.014144193471893456, "acc_norm": 0.6766211604095563, "acc_norm_stderr": 0.013669421630012132 }, "harness|hellaswag|10": { "acc": 0.6674965146385182, "acc_stderr": 0.004701474865207028, "acc_norm": 0.8554072893845848, "acc_norm_stderr": 0.003509709647791844 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5481481481481482, "acc_stderr": 0.04299268905480864, "acc_norm": 0.5481481481481482, "acc_norm_stderr": 0.04299268905480864 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03745554791462456, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03745554791462456 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, 
"acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.0372424959581773, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.0372424959581773 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383888, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383888 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.04657047260594964, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.04657047260594964 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.603225806451613, "acc_stderr": 0.02783123160576794, "acc_norm": 0.603225806451613, "acc_norm_stderr": 0.02783123160576794 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6128205128205129, "acc_stderr": 0.024697216930878937, "acc_norm": 0.6128205128205129, "acc_norm_stderr": 0.024697216930878937 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8055045871559633, "acc_stderr": 0.01697028909045803, "acc_norm": 0.8055045871559633, "acc_norm_stderr": 0.01697028909045803 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4305555555555556, "acc_stderr": 0.03376922151252336, "acc_norm": 0.4305555555555556, "acc_norm_stderr": 0.03376922151252336 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588667, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588667 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.031811497470553604, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.031811497470553604 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.0364129708131373, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.03457272836917669, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917669 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243839, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7055214723926381, "acc_stderr": 0.03581165790474082, "acc_norm": 0.7055214723926381, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8045977011494253, "acc_stderr": 0.014179171373424384, "acc_norm": 0.8045977011494253, "acc_norm_stderr": 0.014179171373424384 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.45027932960893857, "acc_stderr": 0.016639615236845807, "acc_norm": 0.45027932960893857, "acc_norm_stderr": 0.016639615236845807 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.02617390850671858, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.02617390850671858 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6945337620578779, "acc_stderr": 0.026160584450140446, "acc_norm": 0.6945337620578779, "acc_norm_stderr": 0.026160584450140446 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 
0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4621903520208605, "acc_stderr": 0.012733671880342506, "acc_norm": 0.4621903520208605, "acc_norm_stderr": 0.012733671880342506 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.028959755196824866, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824866 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6584967320261438, "acc_stderr": 0.019184639328092487, "acc_norm": 0.6584967320261438, "acc_norm_stderr": 0.019184639328092487 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.0289205832206756, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6318407960199005, "acc_stderr": 0.034104105654953025, "acc_norm": 0.6318407960199005, "acc_norm_stderr": 0.034104105654953025 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774707, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774707 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.46878824969400246, "mc1_stderr": 0.017469364874577533, "mc2": 0.6333324983158626, "mc2_stderr": 0.015558097913908223 }, "harness|winogrande|5": { "acc": 0.7868981846882399, "acc_stderr": 0.011508957690722764 }, "harness|gsm8k|5": { "acc": 0.5420773313115997, "acc_stderr": 0.013723629649844082 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
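The card above quotes the aggregated metrics inline; the same numbers are stored in the dataset's "results" configuration. A minimal sketch for reading them back, which inspects the schema instead of assuming particular column names:

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3"

# "latest" points at the 2024-02-01T20:36:43.405455 run described in the card.
results = load_dataset(repo_id, "results", split="latest")

# Inspect the available columns first rather than assuming the parquet layout.
print(results.column_names)
print(results[0])
```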
open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3
[ "region:us" ]
2024-02-01T20:39:06+00:00
{"pretty_name": "Evaluation run of ibivibiv/aegolius-acadicus-34b-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibivibiv/aegolius-acadicus-34b-v3](https://huggingface.co/ibivibiv/aegolius-acadicus-34b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T20:36:43.405455](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__aegolius-acadicus-34b-v3/blob/main/results_2024-02-01T20-36-43.405455.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6235690414004832,\n \"acc_stderr\": 0.03283989975799449,\n \"acc_norm\": 0.6262345987021161,\n \"acc_norm_stderr\": 0.03349719933623391,\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577533,\n \"mc2\": 0.6333324983158626,\n \"mc2_stderr\": 0.015558097913908223\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893456,\n \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012132\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6674965146385182,\n \"acc_stderr\": 0.004701474865207028,\n \"acc_norm\": 0.8554072893845848,\n \"acc_norm_stderr\": 0.003509709647791844\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.603225806451613,\n \"acc_stderr\": 0.02783123160576794,\n \"acc_norm\": 0.603225806451613,\n \"acc_norm_stderr\": 0.02783123160576794\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8045977011494253,\n \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n \"acc_stderr\": 0.016639615236845807,\n \"acc_norm\": 0.45027932960893857,\n \"acc_norm_stderr\": 0.016639615236845807\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.034104105654953025,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.034104105654953025\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577533,\n \"mc2\": 0.6333324983158626,\n \"mc2_stderr\": 0.015558097913908223\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5420773313115997,\n \"acc_stderr\": 0.013723629649844082\n }\n}\n```", "repo_url": 
"https://huggingface.co/ibivibiv/aegolius-acadicus-34b-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-36-43.405455.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-36-43.405455.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-36-43.405455.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-36-43.405455.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-36-43.405455.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T20_36_43.405455", "path": ["**/details_harness|winogrande|5_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T20-36-43.405455.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T20_36_43.405455", "path": ["results_2024-02-01T20-36-43.405455.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T20-36-43.405455.parquet"]}]}]}
2024-02-01T20:39:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-34b-v3 Dataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-34b-v3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T20:36:43.405455 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-34b-v3\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-34b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:36:43.405455(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ibivibiv/aegolius-acadicus-34b-v3\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/aegolius-acadicus-34b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:36:43.405455(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
921bfc9d43628232e9a875748a004e03634c7bc9
[reddit-instruct](https://huggingface.co/datasets/euclaise/reddit-instruct) filtered for instruction/question-like properties, using [Lilac](https://www.lilacml.com/)
euclaise/reddit-instruct-curated
[ "task_categories:question-answering", "size_categories:10K<n<100K", "language:en", "license:mit", "human-data", "lilac", "region:us" ]
2024-02-01T20:47:53+00:00
{"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "dataset_info": {"features": [{"name": "post_title", "dtype": "string"}, {"name": "post_text", "dtype": "string"}, {"name": "comment_text", "dtype": "string"}, {"name": "comment_score", "dtype": "int64"}, {"name": "post_score", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 14757454.058359765, "num_examples": 10803}, {"name": "test", "num_bytes": 316923.9416402356, "num_examples": 232}], "download_size": 9352028, "dataset_size": 15074378}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["human-data", "lilac"]}
2024-02-01T21:07:56+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-mit #human-data #lilac #region-us
reddit-instruct filtered for instruction/question-like properties, using Lilac
[]
[ "TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #license-mit #human-data #lilac #region-us \n" ]
e5cfa7ccbaa7a4f3d57178fe75bcae1a3a0c72be
# Dataset Card for Evaluation run of Weyaxi/Newton-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Weyaxi/Newton-7B](https://huggingface.co/Weyaxi/Newton-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__Newton-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T20:57:35.185949](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-7B/blob/main/results_2024-02-01T20-57-35.185949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6202881703209492, "acc_stderr": 0.032138997104958926, "acc_norm": 0.6312377539040873, "acc_norm_stderr": 0.0329280849118904, "mc1": 0.2876376988984088, "mc1_stderr": 0.01584631510139481, "mc2": 0.4436037082395254, "mc2_stderr": 0.015171870706558463 }, "harness|arc:challenge|25": { "acc": 0.6023890784982935, "acc_stderr": 0.01430175222327954, "acc_norm": 0.6399317406143344, "acc_norm_stderr": 0.014027516814585188 }, "harness|hellaswag|10": { "acc": 0.6266679944234216, "acc_stderr": 0.0048270065208028835, "acc_norm": 0.817167894841665, "acc_norm_stderr": 0.0038573886135331035 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.04276349494376599, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.02890159361241178, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.02890159361241178 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr":
0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.03252909619613197, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.03252909619613197 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305527, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305527 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782655, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782655 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121417, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121417 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6230769230769231, "acc_stderr": 0.024570975364225995, "acc_norm": 0.6230769230769231, "acc_norm_stderr": 0.024570975364225995 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524586, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524586 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6386554621848739, "acc_stderr": 0.031204691225150016, "acc_norm": 0.6386554621848739, "acc_norm_stderr": 0.031204691225150016 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 
0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590172, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590172 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640766, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640766 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7130044843049327, "acc_stderr": 0.030360379710291947, "acc_norm": 0.7130044843049327, "acc_norm_stderr": 0.030360379710291947 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313728, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313728 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615623, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.01987565502786744, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.01987565502786744 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993457, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993457 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30837988826815643, "acc_stderr": 0.015445716910998884, "acc_norm": 0.30837988826815643, "acc_norm_stderr": 0.015445716910998884 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.026256053835718964, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.026256053835718964 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.025839898334877983, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.025839898334877983 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4326241134751773, "acc_stderr": 0.02955545423677885, 
"acc_norm": 0.4326241134751773, "acc_norm_stderr": 0.02955545423677885 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214963, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214963 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462916, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462916 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6552287581699346, "acc_stderr": 0.019228322018696647, "acc_norm": 0.6552287581699346, "acc_norm_stderr": 0.019228322018696647 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866766, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.2876376988984088, "mc1_stderr": 0.01584631510139481, "mc2": 0.4436037082395254, "mc2_stderr": 0.015171870706558463 }, "harness|winogrande|5": { "acc": 0.7884767166535123, "acc_stderr": 0.011477747684223187 }, "harness|gsm8k|5": { "acc": 0.03411675511751327, "acc_stderr": 0.005000212600773276 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Weyaxi__Newton-7B
[ "region:us" ]
2024-02-01T20:59:53+00:00
{"pretty_name": "Evaluation run of Weyaxi/Newton-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Newton-7B](https://huggingface.co/Weyaxi/Newton-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Newton-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T20:57:35.185949](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-7B/blob/main/results_2024-02-01T20-57-35.185949.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6202881703209492,\n \"acc_stderr\": 0.032138997104958926,\n \"acc_norm\": 0.6312377539040873,\n \"acc_norm_stderr\": 0.0329280849118904,\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.4436037082395254,\n \"mc2_stderr\": 0.015171870706558463\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585188\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6266679944234216,\n \"acc_stderr\": 0.0048270065208028835,\n \"acc_norm\": 0.817167894841665,\n \"acc_norm_stderr\": 0.0038573886135331035\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 
0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121417,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121417\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n \"acc_stderr\": 
0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 
0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.015445716910998884,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.015445716910998884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677885,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677885\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462916,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462916\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.4436037082395254,\n \"mc2_stderr\": 0.015171870706558463\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223187\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.005000212600773276\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Newton-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-57-35.185949.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-57-35.185949.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-57-35.185949.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T20-57-35.185949.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-57-35.185949.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T20-57-35.185949.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["**/details_harness|winogrande|5_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T20-57-35.185949.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T20_57_35.185949", "path": ["results_2024-02-01T20-57-35.185949.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T20-57-35.185949.parquet"]}]}]}
2024-02-01T21:00:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Weyaxi/Newton-7B Dataset automatically created during the evaluation run of model Weyaxi/Newton-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T20:57:35.185949 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
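The sentence "To load the details from a run, you can for instance do the following:" survives in the processed text above, but the original code fence was stripped during processing. As a reference, here is a minimal sketch of that load, with the repository id and config name taken from this record's metadata (the "train" split always points at the latest results):

```python
from datasets import load_dataset

# Per-sample details for one task configuration of the
# Weyaxi/Newton-7B evaluation run; "train" tracks the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Newton-7B",
    "harness_winogrande_5",
    split="train",
)
```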
[ "# Dataset Card for Evaluation run of Weyaxi/Newton-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Newton-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:57:35.185949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Weyaxi/Newton-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Newton-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T20:57:35.185949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
47718b25113157791b887a989fc102bd4ab33d9c
# Dataset Card for Evaluation run of fhai50032/RolePlayLake-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [fhai50032/RolePlayLake-7B](https://huggingface.co/fhai50032/RolePlayLake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fhai50032__RolePlayLake-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:00:40.724978](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__RolePlayLake-7B/blob/main/results_2024-02-01T21-00-40.724978.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6499703955983278, "acc_stderr": 0.03214093071438826, "acc_norm": 0.6504259317594266, "acc_norm_stderr": 0.03280141576561222, "mc1": 0.4847001223990208, "mc1_stderr": 0.0174953044731879, "mc2": 0.6437979342958777, "mc2_stderr": 0.015378685729976286 }, "harness|arc:challenge|25": { "acc": 0.6697952218430034, "acc_stderr": 0.013743085603760424, "acc_norm": 0.7056313993174061, "acc_norm_stderr": 0.013318528460539419 }, "harness|hellaswag|10": { "acc": 0.6999601672973511, "acc_stderr": 0.004573383672159084, "acc_norm": 0.874228241386178, "acc_norm_stderr": 0.003309142727351082 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.04951218252396262, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.04951218252396262 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782648, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782648 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977938, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977938 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 
0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092448, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092448 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.02574490253229092, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.02574490253229092 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229146, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229146 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137276, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137276 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.0225090339370778, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.0225090339370778 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500097, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500097 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46088657105606257, "acc_stderr": 0.012731102790504515, "acc_norm": 0.46088657105606257, "acc_norm_stderr": 0.012731102790504515 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6552287581699346, "acc_stderr": 0.01922832201869664, "acc_norm": 0.6552287581699346, "acc_norm_stderr": 0.01922832201869664 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578327, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.4847001223990208, "mc1_stderr": 0.0174953044731879, "mc2": 0.6437979342958777, "mc2_stderr": 0.015378685729976286 }, "harness|winogrande|5": { "acc": 0.8326756116811366, "acc_stderr": 0.010490608806828075 }, "harness|gsm8k|5": { "acc": 0.6504927975739196, "acc_stderr": 0.013133836511705993 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
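The loading snippet in the card above reads a single task configuration through its "train" split. Below is a minimal sketch of two further access patterns, assuming network access to the Hugging Face Hub and using only names declared in this card and its metadata: the aggregated "results" configuration, the per-task "harness_gsm8k_5" configuration, and the "latest" split alias.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_fhai50032__RolePlayLake-7B"

# Aggregated metrics for this run: the "results" config, "latest" split
# (both names come from the card metadata, not from this sketch).
results = load_dataset(repo, "results", split="latest")
print(results[0].keys())  # inspect the columns of the aggregated row

# Per-sample details for one task, e.g. the 5-shot GSM8K harness run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(len(gsm8k_details))  # number of evaluated GSM8K examples
```

Since "latest" is an alias for the timestamped split of the most recent run, the same call should keep working if the model is re-evaluated later.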
open-llm-leaderboard/details_fhai50032__RolePlayLake-7B
[ "region:us" ]
2024-02-01T21:03:02+00:00
{"pretty_name": "Evaluation run of fhai50032/RolePlayLake-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [fhai50032/RolePlayLake-7B](https://huggingface.co/fhai50032/RolePlayLake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fhai50032__RolePlayLake-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:00:40.724978](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__RolePlayLake-7B/blob/main/results_2024-02-01T21-00-40.724978.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499703955983278,\n \"acc_stderr\": 0.03214093071438826,\n \"acc_norm\": 0.6504259317594266,\n \"acc_norm_stderr\": 0.03280141576561222,\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6437979342958777,\n \"mc2_stderr\": 0.015378685729976286\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760424,\n \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.013318528460539419\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6999601672973511,\n \"acc_stderr\": 0.004573383672159084,\n \"acc_norm\": 0.874228241386178,\n \"acc_norm_stderr\": 0.003309142727351082\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137276,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137276\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.0225090339370778,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.0225090339370778\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n 
\"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6437979342958777,\n \"mc2_stderr\": 0.015378685729976286\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6504927975739196,\n \"acc_stderr\": 0.013133836511705993\n }\n}\n```", "repo_url": "https://huggingface.co/fhai50032/RolePlayLake-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-00-40.724978.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-00-40.724978.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-00-40.724978.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-00-40.724978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-00-40.724978.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-00-40.724978.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["**/details_harness|winogrande|5_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-00-40.724978.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T21_00_40.724978", "path": ["results_2024-02-01T21-00-40.724978.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T21-00-40.724978.parquet"]}]}]}
2024-02-01T21:03:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fhai50032/RolePlayLake-7B Dataset automatically created during the evaluation run of model fhai50032/RolePlayLake-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:00:40.724978 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
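The load snippet referenced above ("To load the details from a run, you can for instance do the following:") was stripped when this card text was flattened. A minimal sketch of what it would look like is given below; the repo id is an assumption based on the usual open-llm-leaderboard naming scheme (`details_<org>__<model>`), and `harness_winogrande_5` is one of the configurations declared for this run in the config listing above.

```python
from datasets import load_dataset

# Hypothetical repo id, assuming the standard open-llm-leaderboard naming
# ("details_<org>__<model>"); adjust if the actual details repo differs.
data = load_dataset(
    "open-llm-leaderboard/details_fhai50032__RolePlayLake-7B",
    "harness_winogrande_5",  # any config listed in this repo's YAML works
    split="train",
)
```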
[ "# Dataset Card for Evaluation run of fhai50032/RolePlayLake-7B\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/RolePlayLake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:00:40.724978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fhai50032/RolePlayLake-7B\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/RolePlayLake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:00:40.724978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f9740cc650677dd1fca8e47abe0819b83fb6e405
# Thomas Rowlandson Hand-Colored Etchings Collection Welcome to the Thomas Rowlandson Hand-Colored Etchings Collection, a comprehensive dataset of public domain artworks by the renowned British artist Thomas Rowlandson, featuring vibrant, hand-colored etchings. Sourced from the National Gallery of Art, this dataset is enhanced with captions generated by GPT-Vision and is designed for training AI models in recognizing, understanding, and generating art-related imagery. [![Discord](https://img.shields.io/discord/1091306623819059300?color=7289da&label=Discord&logo=discord&logoColor=fff&style=for-the-badge)](https://discord.com/invite/m3TBB9XEkb) ## Dataset Overview - **Content**: This collection showcases 80 hand-selected, hand-colored etchings by Thomas Rowlandson, offering a glimpse into 18th and early 19th-century British society, culture, and humor. The artworks depict a wide array of subjects, from social satires to picturesque landscapes, providing a diverse range of scenes for analysis. - **Source**: The etchings, now in the public domain, are accessed from the National Gallery of Art, curated to include descriptive captions for each piece, thus making it a unique resource for AI training and art historical education. - **Usage**: Aimed at AI model training, this dataset can be utilized for tasks such as art style analysis, historical context learning, pattern recognition in art, and the generation of art-inspired images. ## Licensing - The hand-colored etchings by Thomas Rowlandson, sourced from the National Gallery of Art, are in the public domain. The curated dataset, along with GPT-Vision generated captions, is available under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for non-commercial use, requiring attribution and prohibiting commercial exploitation. - For more information on this license, please visit [CC BY-NC 2.0 License details](https://creativecommons.org/licenses/by-nc/2.0/). ## Dataset Composition Each artwork in the dataset is paired with a caption designed to optimize AI training, incorporating techniques such as token shuffling. This fusion of historical art and contemporary AI technology offers a valuable resource for developers, researchers, and art historians. ## How to Use the Collection 1. **Download the Collection**: Access the collection through the provided link for non-commercial purposes related to AI model training. 2. **Explore Artworks and Captions**: Delve into the collection to explore Rowlandson's diverse etchings and the accompanying detailed captions. 3. **Apply in AI Training**: Use the dataset to train AI models, leveraging the rich captions to enhance models' understanding of art history and stylistic nuances. ## Contributions and Feedback Your feedback and contributions are highly appreciated. If you wish to offer feedback or contribute additional images or captions to enrich the collection, please contact us. Your involvement helps to continually refine this dataset for the benefit of the AI, art, and historical research communities. ## Related For insights into ethical approaches to AI model training and the use of art datasets, visit [Crafting the Future: Blibla's Ethical Approach to AI Model Training](https://blib.la/blog/crafting-the-future-blibla-s-ethical-approach-to-ai-model-training). --- The Thomas Rowlandson Hand-Colored Etchings Collection stands as an invaluable tool for advancing AI's grasp of art and history, promising to be a cornerstone resource in your AI projects. ---
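For readers who want to script the "Download the Collection" step above, the sketch below is one possible approach rather than an official loader: it assumes the collection can be fetched from the Hugging Face Hub under the repo id shown on this page, and that each etching is stored as an image file with a plain-text caption file next to it (the real file layout and extensions may differ).

```python
from pathlib import Path
from huggingface_hub import snapshot_download

# Download the full collection locally (non-commercial use only, CC BY-NC 2.0).
local_dir = snapshot_download(
    repo_id="Blib-la/thomas_rowlandson_dataset",  # repo id of this dataset page
    repo_type="dataset",
)

# Assumption: captions live in .txt sidecar files next to each image.
pairs = []
for image_path in Path(local_dir).rglob("*"):
    if image_path.suffix.lower() in {".png", ".jpg", ".jpeg"}:
        caption_path = image_path.with_suffix(".txt")
        if caption_path.exists():
            pairs.append((image_path, caption_path.read_text(encoding="utf-8")))

print(f"Found {len(pairs)} image/caption pairs")
```

If the repository instead ships a metadata table (for example a `metadata.jsonl` or parquet file), pairing images with captions through `datasets.load_dataset` would be the simpler route.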
Blib-la/thomas_rowlandson_dataset
[ "license:cc-by-nc-2.0", "region:us" ]
2024-02-01T21:03:02+00:00
{"license": "cc-by-nc-2.0", "viewer": false}
2024-02-02T04:30:50+00:00
[]
[]
TAGS #license-cc-by-nc-2.0 #region-us
# Thomas Rowlandson Hand-Colored Etchings Collection Welcome to the Thomas Rowlandson Hand-Colored Etchings Collection, a comprehensive dataset of public domain artworks by the renowned British artist Thomas Rowlandson, featuring vibrant, hand-colored etchings. Sourced from the National Gallery of Art, this dataset is enhanced with captions generated by GPT-Vision and is designed for training AI models in recognizing, understanding, and generating art-related imagery. ![Discord](URL ## Dataset Overview - Content: This collection showcases 80 hand-selected, hand-colored etchings by Thomas Rowlandson, offering a glimpse into 18th and early 19th-century British society, culture, and humor. The artworks depict a wide array of subjects, from social satires to picturesque landscapes, providing a diverse range of scenes for analysis. - Source: The etchings, now in the public domain, are accessed from the National Gallery of Art, curated to include descriptive captions for each piece, thus making it a unique resource for AI training and art historical education. - Usage: Aimed at AI model training, this dataset can be utilized for tasks such as art style analysis, historical context learning, pattern recognition in art, and the generation of art-inspired images. ## Licensing - The hand-colored etchings by Thomas Rowlandson, sourced from the National Gallery of Art, are in the public domain. The curated dataset, along with GPT-Vision generated captions, is available under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for non-commercial use, requiring attribution and prohibiting commercial exploitation. - For more information on this license, please visit CC BY-NC 2.0 License details. ## Dataset Composition Each artwork in the dataset is paired with a caption designed to optimize AI training, incorporating techniques such as token shuffling. This fusion of historical art and contemporary AI technology offers a valuable resource for developers, researchers, and art historians. ## How to Use the Collection 1. Download the Collection: Access the collection through the provided link for non-commercial purposes related to AI model training. 2. Explore Artworks and Captions: Delve into the collection to explore Rowlandson's diverse etchings and the accompanying detailed captions. 3. Apply in AI Training: Use the dataset to train AI models, leveraging the rich captions to enhance models' understanding of art history and stylistic nuances. ## Contributions and Feedback Your feedback and contributions are highly appreciated. If you wish to offer feedback or contribute additional images or captions to enrich the collection, please contact us. Your involvement helps to continually refine this dataset for the benefit of the AI, art, and historical research communities. ## Related For insights into ethical approaches to AI model training and the use of art datasets, visit Crafting the Future: Blibla's Ethical Approach to AI Model Training. --- The Thomas Rowlandson Hand-Colored Etchings Collection stands as an invaluable tool for advancing AI's grasp of art and history, promising to be a cornerstone resource in your AI projects. ---
[ "# Thomas Rowlandson Hand-Colored Etchings Collection\n\nWelcome to the Thomas Rowlandson Hand-Colored Etchings Collection, a comprehensive dataset of public domain artworks by the renowned British artist Thomas Rowlandson, featuring vibrant, hand-colored etchings. Sourced from the National Gallery of Art, this dataset is enhanced with captions generated by GPT-Vision and is designed for training AI models in recognizing, understanding, and generating art-related imagery.\n\n![Discord](URL", "## Dataset Overview\n\n- Content: This collection showcases 80 hand-selected, hand-colored etchings by Thomas Rowlandson, offering a glimpse into 18th and early 19th-century British society, culture, and humor. The artworks depict a wide array of subjects, from social satires to picturesque landscapes, providing a diverse range of scenes for analysis.\n- Source: The etchings, now in the public domain, are accessed from the National Gallery of Art, curated to include descriptive captions for each piece, thus making it a unique resource for AI training and art historical education.\n- Usage: Aimed at AI model training, this dataset can be utilized for tasks such as art style analysis, historical context learning, pattern recognition in art, and the generation of art-inspired images.", "## Licensing\n\n- The hand-colored etchings by Thomas Rowlandson, sourced from the National Gallery of Art, are in the public domain. The curated dataset, along with GPT-Vision generated captions, is available under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for non-commercial use, requiring attribution and prohibiting commercial exploitation.\n- For more information on this license, please visit CC BY-NC 2.0 License details.", "## Dataset Composition\n\nEach artwork in the dataset is paired with a caption designed to optimize AI training, incorporating techniques such as token shuffling. This fusion of historical art and contemporary AI technology offers a valuable resource for developers, researchers, and art historians.", "## How to Use the Collection\n\n1. Download the Collection: Access the collection through the provided link for non-commercial purposes related to AI model training.\n2. Explore Artworks and Captions: Delve into the collection to explore Rowlandson's diverse etchings and the accompanying detailed captions.\n3. Apply in AI Training: Use the dataset to train AI models, leveraging the rich captions to enhance models' understanding of art history and stylistic nuances.", "## Contributions and Feedback\n\nYour feedback and contributions are highly appreciated. If you wish to offer feedback or contribute additional images or captions to enrich the collection, please contact us. Your involvement helps to continually refine this dataset for the benefit of the AI, art, and historical research communities.", "## Related\n\nFor insights into ethical approaches to AI model training and the use of art datasets, visit Crafting the Future: Blibla's Ethical Approach to AI Model Training.\n\n---\n\nThe Thomas Rowlandson Hand-Colored Etchings Collection stands as an invaluable tool for advancing AI's grasp of art and history, promising to be a cornerstone resource in your AI projects.\n\n---" ]
[ "TAGS\n#license-cc-by-nc-2.0 #region-us \n", "# Thomas Rowlandson Hand-Colored Etchings Collection\n\nWelcome to the Thomas Rowlandson Hand-Colored Etchings Collection, a comprehensive dataset of public domain artworks by the renowned British artist Thomas Rowlandson, featuring vibrant, hand-colored etchings. Sourced from the National Gallery of Art, this dataset is enhanced with captions generated by GPT-Vision and is designed for training AI models in recognizing, understanding, and generating art-related imagery.\n\n![Discord](URL", "## Dataset Overview\n\n- Content: This collection showcases 80 hand-selected, hand-colored etchings by Thomas Rowlandson, offering a glimpse into 18th and early 19th-century British society, culture, and humor. The artworks depict a wide array of subjects, from social satires to picturesque landscapes, providing a diverse range of scenes for analysis.\n- Source: The etchings, now in the public domain, are accessed from the National Gallery of Art, curated to include descriptive captions for each piece, thus making it a unique resource for AI training and art historical education.\n- Usage: Aimed at AI model training, this dataset can be utilized for tasks such as art style analysis, historical context learning, pattern recognition in art, and the generation of art-inspired images.", "## Licensing\n\n- The hand-colored etchings by Thomas Rowlandson, sourced from the National Gallery of Art, are in the public domain. The curated dataset, along with GPT-Vision generated captions, is available under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for non-commercial use, requiring attribution and prohibiting commercial exploitation.\n- For more information on this license, please visit CC BY-NC 2.0 License details.", "## Dataset Composition\n\nEach artwork in the dataset is paired with a caption designed to optimize AI training, incorporating techniques such as token shuffling. This fusion of historical art and contemporary AI technology offers a valuable resource for developers, researchers, and art historians.", "## How to Use the Collection\n\n1. Download the Collection: Access the collection through the provided link for non-commercial purposes related to AI model training.\n2. Explore Artworks and Captions: Delve into the collection to explore Rowlandson's diverse etchings and the accompanying detailed captions.\n3. Apply in AI Training: Use the dataset to train AI models, leveraging the rich captions to enhance models' understanding of art history and stylistic nuances.", "## Contributions and Feedback\n\nYour feedback and contributions are highly appreciated. If you wish to offer feedback or contribute additional images or captions to enrich the collection, please contact us. Your involvement helps to continually refine this dataset for the benefit of the AI, art, and historical research communities.", "## Related\n\nFor insights into ethical approaches to AI model training and the use of art datasets, visit Crafting the Future: Blibla's Ethical Approach to AI Model Training.\n\n---\n\nThe Thomas Rowlandson Hand-Colored Etchings Collection stands as an invaluable tool for advancing AI's grasp of art and history, promising to be a cornerstone resource in your AI projects.\n\n---" ]
3cb5d01b474c3cc8a2211c09b62b33e6a197d963
# Remilio [Redacted Remilio Babies](https://remilio.org/) is a collection of 10,000 neochibi pfpNFTs evolving the proven Milady Maker paradigm with the introduction of young J.I.T. energy, schizophrenic reactionary aesthetics, and digital sales terrorism. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/643ae6350e5495afdefb26e1/18r2_9iN3SmU-ME_He31N.png)
hayden-donnelly/remilio
[ "task_categories:image-classification", "task_categories:unconditional-image-generation", "task_categories:text-to-image", "size_categories:1K<n<10K", "language:en", "license:other", "region:us" ]
2024-02-01T21:07:30+00:00
{"language": ["en"], "license": "other", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification", "unconditional-image-generation", "text-to-image"], "pretty_name": "Remilio", "license_name": "viral-public-license", "license_link": "LICENSE"}
2024-02-02T08:29:10+00:00
[]
[ "en" ]
TAGS #task_categories-image-classification #task_categories-unconditional-image-generation #task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-other #region-us
# Remilio Redacted Remilio Babies is a collection of 10,000 neochibi pfpNFT's evolving the proven Milady Maker paradigm with the introduction of young J.I.T. energy, schizophrenic reactionary aesthetics, and digital sales terrorism. !image/png
[ "# Remilio\n\nRedacted Remilio Babies is a collection of 10,000 neochibi pfpNFT's evolving the \nproven Milady Maker paradigm with the introduction of young J.I.T. energy, schizophrenic reactionary aesthetics, \nand digital sales terrorism.\n\n!image/png" ]
[ "TAGS\n#task_categories-image-classification #task_categories-unconditional-image-generation #task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-other #region-us \n", "# Remilio\n\nRedacted Remilio Babies is a collection of 10,000 neochibi pfpNFT's evolving the \nproven Milady Maker paradigm with the introduction of young J.I.T. energy, schizophrenic reactionary aesthetics, \nand digital sales terrorism.\n\n!image/png" ]
3cc4cd16d629fba8856307dfaea1ed75aeffa298
# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:09:52.023664](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser/blob/main/results_2024-02-01T21-09-52.023664.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26487177878693896, "acc_stderr": 0.031083173918083885, "acc_norm": 0.26611351733798344, "acc_norm_stderr": 0.0318546335977903, "mc1": 0.2252141982864137, "mc1_stderr": 0.014623240768023507, "mc2": 0.36332154287207935, "mc2_stderr": 0.014014442507659016 }, "harness|arc:challenge|25": { "acc": 0.3097269624573379, "acc_stderr": 0.013512058415238361, "acc_norm": 0.33361774744027306, "acc_norm_stderr": 0.013778687054176538 }, "harness|hellaswag|10": { "acc": 0.45140410276837284, "acc_stderr": 0.004966158142645414, "acc_norm": 0.5853415654252141, "acc_norm_stderr": 0.004916561213591292 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.32592592592592595, "acc_stderr": 0.040491220417025055, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.19736842105263158, "acc_stderr": 0.03238981601699397, "acc_norm": 0.19736842105263158, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.25660377358490566, "acc_stderr": 0.026880647889051975, "acc_norm": 0.25660377358490566, "acc_norm_stderr": 0.026880647889051975 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.18055555555555555, "acc_stderr": 0.03216600808802269, "acc_norm": 0.18055555555555555, "acc_norm_stderr": 0.03216600808802269 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.23121387283236994, "acc_stderr": 0.0321473730202947, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.0321473730202947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.04158307533083286, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.04158307533083286 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3148936170212766, "acc_stderr": 0.030363582197238153, "acc_norm": 0.3148936170212766, "acc_norm_stderr": 0.030363582197238153 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708614, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708614 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25806451612903225, "acc_stderr": 0.024892469172462833, "acc_norm": 0.25806451612903225, "acc_norm_stderr": 0.024892469172462833 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.03127090713297698, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.03127090713297698 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.23232323232323232, "acc_stderr": 0.030088629490217483, "acc_norm": 0.23232323232323232, "acc_norm_stderr": 0.030088629490217483 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.26424870466321243, "acc_stderr": 0.03182155050916646, "acc_norm": 0.26424870466321243, "acc_norm_stderr": 0.03182155050916646 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.31025641025641026, "acc_stderr": 0.023454674889404288, "acc_norm": 0.31025641025641026, "acc_norm_stderr": 0.023454674889404288 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275805, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275805 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2689075630252101, "acc_stderr": 0.028801392193631276, 
"acc_norm": 0.2689075630252101, "acc_norm_stderr": 0.028801392193631276 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.27339449541284405, "acc_stderr": 0.01910929984609828, "acc_norm": 0.27339449541284405, "acc_norm_stderr": 0.01910929984609828 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.32407407407407407, "acc_stderr": 0.03191923445686185, "acc_norm": 0.32407407407407407, "acc_norm_stderr": 0.03191923445686185 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2696078431372549, "acc_stderr": 0.03114557065948678, "acc_norm": 0.2696078431372549, "acc_norm_stderr": 0.03114557065948678 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.028756799629658335, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.028756799629658335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.21374045801526717, "acc_stderr": 0.0359546161177469, "acc_norm": 0.21374045801526717, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.34710743801652894, "acc_stderr": 0.04345724570292535, "acc_norm": 0.34710743801652894, "acc_norm_stderr": 0.04345724570292535 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2037037037037037, "acc_stderr": 0.03893542518824849, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.03893542518824849 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2392638036809816, "acc_stderr": 0.033519538795212696, "acc_norm": 0.2392638036809816, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.2621359223300971, "acc_stderr": 0.043546310772605956, "acc_norm": 0.2621359223300971, "acc_norm_stderr": 0.043546310772605956 }, "harness|hendrycksTest-marketing|5": { "acc": 0.25213675213675213, "acc_stderr": 0.02844796547623102, "acc_norm": 0.25213675213675213, "acc_norm_stderr": 0.02844796547623102 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23371647509578544, "acc_stderr": 0.015133383278988825, "acc_norm": 0.23371647509578544, "acc_norm_stderr": 0.015133383278988825 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.28901734104046245, "acc_stderr": 0.02440517393578323, "acc_norm": 0.28901734104046245, "acc_norm_stderr": 0.02440517393578323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2581699346405229, "acc_stderr": 0.025058503316958143, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.025058503316958143 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2990353697749196, "acc_stderr": 0.026003301117885142, "acc_norm": 0.2990353697749196, "acc_norm_stderr": 0.026003301117885142 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.26851851851851855, "acc_stderr": 0.02465968518596729, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.02465968518596729 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23049645390070922, "acc_stderr": 0.0251237392268724, "acc_norm": 0.23049645390070922, "acc_norm_stderr": 0.0251237392268724 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2607561929595828, "acc_stderr": 0.011213471559602325, "acc_norm": 0.2607561929595828, "acc_norm_stderr": 0.011213471559602325 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.28594771241830064, "acc_stderr": 0.01828048507295467, "acc_norm": 0.28594771241830064, "acc_norm_stderr": 0.01828048507295467 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3181818181818182, "acc_stderr": 0.04461272175910508, "acc_norm": 0.3181818181818182, "acc_norm_stderr": 0.04461272175910508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.16326530612244897, "acc_stderr": 0.023661699177098615, "acc_norm": 0.16326530612244897, "acc_norm_stderr": 0.023661699177098615 }, "harness|hendrycksTest-sociology|5": { "acc": 0.26865671641791045, "acc_stderr": 0.03134328358208954, "acc_norm": 0.26865671641791045, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.16, "acc_stderr": 0.03684529491774709, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.29518072289156627, "acc_stderr": 0.035509201856896294, "acc_norm": 0.29518072289156627, "acc_norm_stderr": 0.035509201856896294 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.24561403508771928, "acc_stderr": 0.03301405946987249, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.03301405946987249 }, "harness|truthfulqa:mc|0": { "mc1": 0.2252141982864137, "mc1_stderr": 0.014623240768023507, "mc2": 0.36332154287207935, "mc2_stderr": 0.014014442507659016 }, "harness|winogrande|5": { "acc": 0.601420678768745, "acc_stderr": 0.013760357176873836 }, "harness|gsm8k|5": { "acc": 0.01288855193328279, "acc_stderr": 0.003106901266499662 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. 
--> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
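As a complement to the per-task example in the card above, the aggregated scores can be pulled from the "results" configuration mentioned in the summary. A small sketch, assuming the "latest" split naming that this family of repos uses for its data files:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration instead of a single task config.
results = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent evaluation run
```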
open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser
[ "region:us" ]
2024-02-01T21:12:13+00:00
{"pretty_name": "Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser", "dataset_summary": "Dataset automatically created during the evaluation run of model [cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:09:52.023664](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser/blob/main/results_2024-02-01T21-09-52.023664.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26487177878693896,\n \"acc_stderr\": 0.031083173918083885,\n \"acc_norm\": 0.26611351733798344,\n \"acc_norm_stderr\": 0.0318546335977903,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.36332154287207935,\n \"mc2_stderr\": 0.014014442507659016\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3097269624573379,\n \"acc_stderr\": 0.013512058415238361,\n \"acc_norm\": 0.33361774744027306,\n \"acc_norm_stderr\": 0.013778687054176538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45140410276837284,\n \"acc_stderr\": 0.004966158142645414,\n \"acc_norm\": 0.5853415654252141,\n \"acc_norm_stderr\": 0.004916561213591292\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051975,\n \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051975\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.18055555555555555,\n 
\"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238153,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238153\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.024892469172462833,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.024892469172462833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 
0.03182155050916646,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916646\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27339449541284405,\n \"acc_stderr\": 0.01910929984609828,\n \"acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.01910929984609828\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.03893542518824849,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.03893542518824849\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n 
\"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n \"acc_stderr\": 0.015133383278988825,\n \"acc_norm\": 0.23371647509578544,\n \"acc_norm_stderr\": 0.015133383278988825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958143,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596729,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596729\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.0251237392268724,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.0251237392268724\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2607561929595828,\n \"acc_stderr\": 0.011213471559602325,\n \"acc_norm\": 0.2607561929595828,\n \"acc_norm_stderr\": 0.011213471559602325\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28594771241830064,\n \"acc_stderr\": 0.01828048507295467,\n \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.01828048507295467\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.023661699177098615,\n \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.023661699177098615\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.36332154287207935,\n \"mc2_stderr\": 0.014014442507659016\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.601420678768745,\n \"acc_stderr\": 0.013760357176873836\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499662\n }\n}\n```", "repo_url": "https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["**/details_harness|winogrande|5_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-09-52.023664.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T21_09_52.023664", "path": ["results_2024-02-01T21-09-52.023664.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T21-09-52.023664.parquet"]}]}]}
2024-02-01T21:12:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser Dataset automatically created during the evaluation run of model cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:09:52.023664 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:09:52.023664(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:09:52.023664(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
861e5de72613cb395ef0d6c0d868e3fbb3c157e7
# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:12:28.457963](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge/blob/main/results_2024-02-01T21-12-28.457963.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6539355408319422, "acc_stderr": 0.03205035581162719, "acc_norm": 0.6534261287046614, "acc_norm_stderr": 0.03272286633720597, "mc1": 0.5263157894736842, "mc1_stderr": 0.017479241161975457, "mc2": 0.674957392013114, "mc2_stderr": 0.014896008898157733 }, "harness|arc:challenge|25": { "acc": 0.6902730375426621, "acc_stderr": 0.013512058415238363, "acc_norm": 0.7141638225255973, "acc_norm_stderr": 0.013203196088537376 }, "harness|hellaswag|10": { "acc": 0.6989643497311293, "acc_stderr": 0.00457770702503138, "acc_norm": 0.8808006373232424, "acc_norm_stderr": 0.0032336074238899773 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700914, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700914 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm":
0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.047028804320496165, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531003, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531003 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083522, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083522 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6512820512820513, "acc_stderr": 0.02416278028401772, "acc_norm": 0.6512820512820513, "acc_norm_stderr": 0.02416278028401772 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.029443169323031537, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.029443169323031537 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.0302839955258844, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374307, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374307 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931038, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931038 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4201117318435754, "acc_stderr": 0.016507671073256402, "acc_norm": 0.4201117318435754, "acc_norm_stderr": 0.016507671073256402 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.012741974333897226, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.012741974333897226 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.028661996202335303, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.028661996202335303 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421606, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421606 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5263157894736842, "mc1_stderr": 0.017479241161975457, "mc2": 0.674957392013114, "mc2_stderr": 0.014896008898157733 }, "harness|winogrande|5": { "acc": 0.8476716653512234, "acc_stderr": 0.010099208246065597 }, "harness|gsm8k|5": { "acc": 0.6959818043972706, "acc_stderr": 0.01267042044019867 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
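As a usage note complementing the `load_dataset` example in the card above: a minimal sketch of pulling the aggregated run-level metrics instead of a single task's per-sample details. It assumes the config names for this dataset follow the same pattern as the metadata of the other runs in this dump (e.g. `results` and `harness_arc_challenge_25`) and that the `latest` split points at the newest evaluation.

```python
from datasets import load_dataset

dataset_id = "open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge"

# Aggregated run-level metrics live in the "results" config; "latest" tracks the newest run
results = load_dataset(dataset_id, "results", split="latest")
print(results)

# Per-sample details for one task use that task's config name (assumed naming pattern)
arc_details = load_dataset(dataset_id, "harness_arc_challenge_25", split="latest")
print(arc_details[0])
```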
open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge
[ "region:us" ]
2024-02-01T21:14:50+00:00
{"pretty_name": "Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:12:28.457963](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge/blob/main/results_2024-02-01T21-12-28.457963.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539355408319422,\n \"acc_stderr\": 0.03205035581162719,\n \"acc_norm\": 0.6534261287046614,\n \"acc_norm_stderr\": 0.03272286633720597,\n \"mc1\": 0.5263157894736842,\n \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.674957392013114,\n \"mc2_stderr\": 0.014896008898157733\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238363,\n \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6989643497311293,\n \"acc_stderr\": 0.00457770702503138,\n \"acc_norm\": 0.8808006373232424,\n \"acc_norm_stderr\": 0.0032336074238899773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n 
\"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931038,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931038\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5263157894736842,\n \"mc1_stderr\": 0.017479241161975457,\n \"mc2\": 0.674957392013114,\n \"mc2_stderr\": 0.014896008898157733\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065597\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \"acc_stderr\": 
0.01267042044019867\n }\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_12_28.457963", "path": ["**/details_harness|winogrande|5_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-12-28.457963.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T21_12_28.457963", "path": ["results_2024-02-01T21-12-28.457963.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T21-12-28.457963.parquet"]}]}]}
2024-02-01T21:15:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge Dataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:12:28.457963 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
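The flattened card text above ends its loading instructions at "do the following:" because the original code fence was dropped; the snippet, as given in the dataset_summary metadata for this record, is:

```python
from datasets import load_dataset

# Loads the winogrande details split of this run, as shown in the card's own example.
data = load_dataset(
    "open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge",
    "harness_winogrande_5",
    split="train",
)
```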
[ "# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:12:28.457963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:12:28.457963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b63eefc7b12cf9833653d3ad4f39037d7a719651
# Dataset Card for Evaluation run of Gille/StrangeMerges_12-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_12-7B-slerp](https://huggingface.co/Gille/StrangeMerges_12-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:19:18.956358](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp/blob/main/results_2024-02-02T02-19-18.956358.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6527159056267988, "acc_stderr": 0.032047213513704105, "acc_norm": 0.6543707186687568, "acc_norm_stderr": 0.032695751385077014, "mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502025, "mc2": 0.5255271961828023, "mc2_stderr": 0.014972145811572106 }, "harness|arc:challenge|25": { "acc": 0.6254266211604096, "acc_stderr": 0.014144193471893447, "acc_norm": 0.6663822525597269, "acc_norm_stderr": 0.013778687054176536 }, "harness|hellaswag|10": { "acc": 0.6662019518024298, "acc_stderr": 0.0047060481167649415, "acc_norm": 0.8589922326229835, "acc_norm_stderr": 0.003473182890968969 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700918, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.035868792800803406, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.035868792800803406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3814814814814815, "acc_stderr": 0.029616718927497586, "acc_norm": 0.3814814814814815, "acc_norm_stderr": 0.029616718927497586 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.01584825580650155, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.01584825580650155 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5462962962962963, "acc_stderr": 0.033953227263757976, "acc_norm": 0.5462962962962963, "acc_norm_stderr": 0.033953227263757976 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7721518987341772, "acc_stderr": 0.027303484599069436, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.027303484599069436 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.04058042015646034, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.04058042015646034 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508283, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3854748603351955, "acc_stderr": 0.016277927039638193, "acc_norm": 0.3854748603351955, "acc_norm_stderr": 0.016277927039638193 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7623456790123457, "acc_stderr": 0.023683591837008557, "acc_norm": 0.7623456790123457, "acc_norm_stderr": 0.023683591837008557 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4654498044328553, "acc_stderr": 0.0127397115540457, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.0127397115540457 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031208, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031208 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6977124183006536, "acc_stderr": 0.018579232711113877, "acc_norm": 0.6977124183006536, "acc_norm_stderr": 0.018579232711113877 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.02783302387139968, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.02783302387139968 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306046, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306046 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.35862913096695226, "mc1_stderr": 0.016789289499502025, "mc2": 0.5255271961828023, "mc2_stderr": 0.014972145811572106 }, "harness|winogrande|5": { "acc": 0.8153117600631413, "acc_stderr": 0.01090597811215688 }, "harness|gsm8k|5": { "acc": 0.6262319939347991, "acc_stderr": 0.013326342860737006 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
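The record metadata that follows enumerates one configuration per evaluated task plus an aggregated `results` configuration. As a minimal sketch (assuming the `datasets` library is installed, and assuming the `results` config exposes a `latest` split like the per-task configs listed below), the aggregated scores can be loaded directly:

```python
from datasets import load_dataset

# Minimal sketch: pull the aggregated results for this evaluation run.
# Assumption: the "results" config exposes a "latest" split, mirroring the
# per-task configs enumerated in the metadata below.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp",
    "results",
    split="latest",
)

# Inspect the most recent aggregated record.
print(results[0])
```

Any per-task configuration from the listing below (for example `harness_gsm8k_5`) can be loaded the same way by substituting its name for `"results"`.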
open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp
[ "region:us" ]
2024-02-01T21:20:09+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_12-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_12-7B-slerp](https://huggingface.co/Gille/StrangeMerges_12-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:19:18.956358](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp/blob/main/results_2024-02-02T02-19-18.956358.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527159056267988,\n \"acc_stderr\": 0.032047213513704105,\n \"acc_norm\": 0.6543707186687568,\n \"acc_norm_stderr\": 0.032695751385077014,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5255271961828023,\n \"mc2_stderr\": 0.014972145811572106\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893447,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6662019518024298,\n \"acc_stderr\": 0.0047060481167649415,\n \"acc_norm\": 0.8589922326229835,\n \"acc_norm_stderr\": 0.003473182890968969\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497586,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6977124183006536,\n \"acc_stderr\": 0.018579232711113877,\n \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.018579232711113877\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5255271961828023,\n \"mc2_stderr\": 0.014972145811572106\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6262319939347991,\n \"acc_stderr\": 0.013326342860737006\n 
}\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_12-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-17-50.815553.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-17-50.815553.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-19-18.956358.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-19-18.956358.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-19-18.956358.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-19-18.956358.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-17-50.815553.parquet"]}, 
{"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["**/details_harness|winogrande|5_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": ["**/details_harness|winogrande|5_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-19-18.956358.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T21_17_50.815553", "path": ["results_2024-02-01T21-17-50.815553.parquet"]}, {"split": "2024_02_02T02_19_18.956358", "path": 
["results_2024-02-02T02-19-18.956358.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-19-18.956358.parquet"]}]}]}
2024-02-02T02:22:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_12-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_12-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:19:18.956358 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
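In this flattened rendering of the card, the code block that originally followed "To load the details from a run, you can for instance do the following:" has been stripped out. A minimal sketch of that call is restored below; the repository id is an assumption reconstructed from the model name and the `details_<org>__<model>` naming pattern used elsewhere in this document, and the config and split simply mirror the loading example shown in the other cards.

```python
from datasets import load_dataset

# Assumed repo id (details_<org>__<model> pattern); config and split mirror
# the loading example used by the other evaluation cards in this document.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_12-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
print(data)
```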
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_12-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_12-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:19:18.956358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_12-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_12-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:19:18.956358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
34716583d233f2644ea8a31d96ca35f7950ef6ba
# Dataset Card for Evaluation run of Aryanne/Westest-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Aryanne/Westest-7B](https://huggingface.co/Aryanne/Westest-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Aryanne__Westest-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:38:37.177147](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__Westest-7B/blob/main/results_2024-02-01T21-38-37.177147.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6502517737747912, "acc_stderr": 0.032209239127293725, "acc_norm": 0.6496642643906977, "acc_norm_stderr": 0.032890187869919804, "mc1": 0.5324357405140759, "mc1_stderr": 0.017466632149577613, "mc2": 0.6672312243823757, "mc2_stderr": 0.015360712744265235 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068742, "acc_norm": 0.7218430034129693, "acc_norm_stderr": 0.01309446991953881 }, "harness|hellaswag|10": { "acc": 0.7176857199761004, "acc_stderr": 0.004492055279407108, "acc_norm": 0.8851822346146186, "acc_norm_stderr": 0.003181503506054323 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322663, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322663 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43386243386243384, "acc_stderr": 0.025525034382474894, "acc_norm": 0.43386243386243384, "acc_norm_stderr": 0.025525034382474894 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.806060606060606, "acc_stderr": 0.030874145136562094, "acc_norm": 0.806060606060606, "acc_norm_stderr": 0.030874145136562094 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.02412112541694119, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.02412112541694119 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8275229357798165, "acc_stderr": 0.01619780795684803, "acc_norm": 0.8275229357798165, "acc_norm_stderr": 0.01619780795684803 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467617, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467617 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37094972067039106, "acc_stderr": 0.016155910721341774, "acc_norm": 0.37094972067039106, "acc_norm_stderr": 0.016155910721341774 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.02617390850671858, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.02617390850671858 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.02474862449053737, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.02474862449053737 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.012740853872949834, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.012740853872949834 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687495, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.5324357405140759, "mc1_stderr": 0.017466632149577613, "mc2": 0.6672312243823757, "mc2_stderr": 0.015360712744265235 }, "harness|winogrande|5": { "acc": 0.8658247829518547, "acc_stderr": 0.00957931173993858 }, "harness|gsm8k|5": { "acc": 0.6573161485974223, "acc_stderr": 0.01307303023082791 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
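The card above explains how per-run details are loaded; as a small follow-up sketch (not part of the original card), the snippet below loads the latest split of one per-task configuration whose name appears verbatim in this repository's metadata, `harness_gsm8k_5`, and only inspects the schema, since the column layout of the detail rows is not documented in the card.

```python
from datasets import load_dataset

# Repo id and config name are taken verbatim from this card and its metadata;
# the "latest" split is defined there as an alias for the most recent run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Aryanne__Westest-7B",
    "harness_gsm8k_5",
    split="latest",
)

# The row layout of the detail files is not documented in the card, so only
# the schema and row count are inspected here.
print(gsm8k_details)
print(gsm8k_details.features)
```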
open-llm-leaderboard/details_Aryanne__Westest-7B
[ "region:us" ]
2024-02-01T21:40:59+00:00
{"pretty_name": "Evaluation run of Aryanne/Westest-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aryanne/Westest-7B](https://huggingface.co/Aryanne/Westest-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__Westest-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:38:37.177147](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__Westest-7B/blob/main/results_2024-02-01T21-38-37.177147.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502517737747912,\n \"acc_stderr\": 0.032209239127293725,\n \"acc_norm\": 0.6496642643906977,\n \"acc_norm_stderr\": 0.032890187869919804,\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6672312243823757,\n \"mc2_stderr\": 0.015360712744265235\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068742,\n \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.01309446991953881\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7176857199761004,\n \"acc_stderr\": 0.004492055279407108,\n \"acc_norm\": 0.8851822346146186,\n \"acc_norm_stderr\": 0.003181503506054323\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n 
\"acc_stderr\": 0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684803,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.016155910721341774,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.016155910721341774\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6672312243823757,\n \"mc2_stderr\": 0.015360712744265235\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8658247829518547,\n \"acc_stderr\": 0.00957931173993858\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6573161485974223,\n \"acc_stderr\": 0.01307303023082791\n }\n}\n```", "repo_url": "https://huggingface.co/Aryanne/Westest-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-38-37.177147.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-38-37.177147.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-38-37.177147.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-38-37.177147.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-38-37.177147.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-38-37.177147.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["**/details_harness|winogrande|5_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-38-37.177147.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T21_38_37.177147", "path": ["results_2024-02-01T21-38-37.177147.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T21-38-37.177147.parquet"]}]}]}
2024-02-01T21:41:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aryanne/Westest-7B Dataset automatically created during the evaluation run of model Aryanne/Westest-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:38:37.177147 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
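The card text above refers to a loading snippet ("you can for instance do the following") whose code block was stripped in this plain-text rendering. A minimal sketch of that pattern, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming, so `open-llm-leaderboard/details_Aryanne__Westest-7B` here:

```python
from datasets import load_dataset

# Assumed repository id, derived from the "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_Aryanne__Westest-7B",
    "harness_winogrande_5",   # any per-task configuration listed above works
    split="train",            # "train" always points at the latest results
)
print(data)
```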
[ "# Dataset Card for Evaluation run of Aryanne/Westest-7B\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/Westest-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:38:37.177147(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aryanne/Westest-7B\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/Westest-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:38:37.177147(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5b1df89dbb20bab843789d50a5eb2223de2ed2a1
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:44:48.551629](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1/blob/main/results_2024-02-01T21-44-48.551629.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.235616909983485, "acc_stderr": 0.030096196864815804, "acc_norm": 0.23617812988980863, "acc_norm_stderr": 0.030895352600212644, "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662578, "mc2": 0.49848823283731625, "mc2_stderr": 0.016449164481650215 }, "harness|arc:challenge|25": { "acc": 0.2090443686006826, "acc_stderr": 0.011882746987406458, "acc_norm": 0.2645051194539249, "acc_norm_stderr": 0.012889272949313366 }, "harness|hellaswag|10": { "acc": 0.25632344154550885, "acc_stderr": 0.004357101984278612, "acc_norm": 0.2568213503286198, "acc_norm_stderr": 0.00435987151963954 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313141, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313141 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.24150943396226415, "acc_stderr": 0.026341480371118376, "acc_norm": 0.24150943396226415, "acc_norm_stderr": 0.026341480371118376 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.25, "acc_stderr": 0.03621034121889507, "acc_norm": 0.25, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816507, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816507 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2023121387283237, "acc_stderr": 0.03063114553919882, "acc_norm": 0.2023121387283237, "acc_norm_stderr": 0.03063114553919882 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21052631578947367, "acc_stderr": 0.038351539543994194, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.038351539543994194 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.19, "acc_stderr": 0.03942772444036625, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.18719211822660098, "acc_stderr": 0.027444924966882618, "acc_norm": 0.18719211822660098, "acc_norm_stderr": 0.027444924966882618 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2222222222222222, "acc_stderr": 0.029620227874790482, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.029620227874790482 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2, "acc_stderr": 0.020280805062535722, "acc_norm": 0.2, "acc_norm_stderr": 0.020280805062535722 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.026265024608275882, "acc_norm": 
0.20588235294117646, "acc_norm_stderr": 0.026265024608275882 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.19205298013245034, "acc_stderr": 0.032162984205936135, "acc_norm": 0.19205298013245034, "acc_norm_stderr": 0.032162984205936135 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1889908256880734, "acc_stderr": 0.016785481159203634, "acc_norm": 0.1889908256880734, "acc_norm_stderr": 0.016785481159203634 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.14814814814814814, "acc_stderr": 0.024227629273728356, "acc_norm": 0.14814814814814814, "acc_norm_stderr": 0.024227629273728356 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3004484304932735, "acc_stderr": 0.030769352008229143, "acc_norm": 0.3004484304932735, "acc_norm_stderr": 0.030769352008229143 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22137404580152673, "acc_stderr": 0.0364129708131373, "acc_norm": 0.22137404580152673, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.24393358876117496, "acc_stderr": 0.015357212665829489, "acc_norm": 0.24393358876117496, "acc_norm_stderr": 0.015357212665829489 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2581699346405229, "acc_stderr": 0.025058503316958157, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.025058503316958157 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 
0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3235294117647059, "acc_stderr": 0.02841820861940679, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.02841820861940679 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.2710843373493976, "acc_stderr": 0.034605799075530276, "acc_norm": 0.2710843373493976, "acc_norm_stderr": 0.034605799075530276 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.32748538011695905, "acc_stderr": 0.035993357714560276, "acc_norm": 0.32748538011695905, "acc_norm_stderr": 0.035993357714560276 }, "harness|truthfulqa:mc|0": { "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662578, "mc2": 0.49848823283731625, "mc2_stderr": 0.016449164481650215 }, "harness|winogrande|5": { "acc": 0.4940805051302289, "acc_stderr": 0.014051500838485807 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
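The card above notes that an additional "results" configuration stores the aggregated metrics of the run. A minimal sketch of pulling that aggregate table and inspecting it; the repository id is taken verbatim from this record, while the `"results"` config name and `"latest"` split follow the structure shown for these detail repositories (the metadata listing for this particular repo is truncated below, so treat them as an assumption):

```python
from datasets import load_dataset

# Repository id as given in this record; "results" is the aggregated-metrics
# table the card describes, with a "latest" split aliasing the newest run.
repo = "open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1"
results = load_dataset(repo, "results", split="latest")

# One row per evaluation run; a DataFrame makes the metric columns easy to inspect.
df = results.to_pandas()
print(df.columns.tolist())
```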
open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1
[ "region:us" ]
2024-02-01T21:47:08+00:00
{"pretty_name": "Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:44:48.551629](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1/blob/main/results_2024-02-01T21-44-48.551629.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.235616909983485,\n \"acc_stderr\": 0.030096196864815804,\n \"acc_norm\": 0.23617812988980863,\n \"acc_norm_stderr\": 0.030895352600212644,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.49848823283731625,\n \"mc2_stderr\": 0.016449164481650215\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406458,\n \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313366\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25632344154550885,\n \"acc_stderr\": 0.004357101984278612,\n \"acc_norm\": 0.2568213503286198,\n \"acc_norm_stderr\": 0.00435987151963954\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313141,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313141\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118376,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118376\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n 
\"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n 
\"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1889908256880734,\n \"acc_stderr\": 0.016785481159203634,\n \"acc_norm\": 0.1889908256880734,\n \"acc_norm_stderr\": 0.016785481159203634\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.14814814814814814,\n \"acc_stderr\": 0.024227629273728356,\n \"acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.024227629273728356\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n \"acc_stderr\": 0.015357212665829489,\n \"acc_norm\": 0.24393358876117496,\n \"acc_norm_stderr\": 0.015357212665829489\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02841820861940679,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02841820861940679\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.49848823283731625,\n \"mc2_stderr\": 0.016449164481650215\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4940805051302289,\n \"acc_stderr\": 0.014051500838485807\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n 
\"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_44_48.551629", "path": ["**/details_harness|winogrande|5_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-44-48.551629.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T21_44_48.551629", "path": ["results_2024-02-01T21-44-48.551629.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T21-44-48.551629.parquet"]}]}]}
2024-02-01T21:47:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1 Dataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:44:48.551629 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:44:48.551629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:44:48.551629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
cc14fa2628fda20760393c581b1796f2d2dd3ba0
# Dataset Card for Evaluation run of shadowml/WestBeagle-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [shadowml/WestBeagle-7B](https://huggingface.co/shadowml/WestBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_shadowml__WestBeagle-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:44:58.611202](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__WestBeagle-7B/blob/main/results_2024-02-01T21-44-58.611202.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6571571064004633, "acc_stderr": 0.031991367682673535, "acc_norm": 0.656816938015433, "acc_norm_stderr": 0.03265435140721583, "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7170804043560985, "mc2_stderr": 0.01453935317893205 }, "harness|arc:challenge|25": { "acc": 0.6979522184300341, "acc_stderr": 0.013417519144716417, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059376 }, "harness|hellaswag|10": { "acc": 0.7032463652658832, "acc_stderr": 0.004558933822995549, "acc_norm": 0.8828918542123083, "acc_norm_stderr": 0.0032089195103099334 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.033961162058453336, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652457, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652457 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977934, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977934 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 
0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822914, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822914 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4324022346368715, "acc_stderr": 0.01656897123354861, "acc_norm": 0.4324022346368715, "acc_norm_stderr": 0.01656897123354861 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.023993501709042107, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.023993501709042107 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47522816166883963, "acc_stderr": 0.012754553719781753, "acc_norm": 0.47522816166883963, "acc_norm_stderr": 0.012754553719781753 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031208, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031208 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7170804043560985, "mc2_stderr": 0.01453935317893205 }, "harness|winogrande|5": { "acc": 0.8200473559589582, "acc_stderr": 0.010796468688068677 }, "harness|gsm8k|5": { "acc": 0.7187263078089462, "acc_stderr": 0.012384789310940244 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
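In addition to the per-task snippet above, the following is a minimal sketch of loading the aggregated "results" configuration and one per-task detail configuration from this same repository. It relies only on the `datasets` API already shown in this card; the config and split names are taken from the card text and this record's metadata, and the inspection calls at the end are illustrative additions.

```python
from datasets import load_dataset

# Aggregated metrics of the run, via the "results" configuration described
# in the card above; the "latest" split tracks the newest timestamped results.
results = load_dataset(
    "open-llm-leaderboard/details_shadowml__WestBeagle-7B",
    "results",
    split="latest",
)

# Per-example details for a single task, e.g. the 5-shot GSM8K configuration
# listed in this record's metadata.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_shadowml__WestBeagle-7B",
    "harness_gsm8k_5",
    split="latest",
)

# Illustrative inspection only; exact column names depend on the harness version.
print(results.column_names)
print(gsm8k_details.num_rows)
```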
open-llm-leaderboard/details_shadowml__WestBeagle-7B
[ "region:us" ]
2024-02-01T21:47:19+00:00
{"pretty_name": "Evaluation run of shadowml/WestBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [shadowml/WestBeagle-7B](https://huggingface.co/shadowml/WestBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__WestBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:44:58.611202](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__WestBeagle-7B/blob/main/results_2024-02-01T21-44-58.611202.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6571571064004633,\n \"acc_stderr\": 0.031991367682673535,\n \"acc_norm\": 0.656816938015433,\n \"acc_norm_stderr\": 0.03265435140721583,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7170804043560985,\n \"mc2_stderr\": 0.01453935317893205\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7032463652658832,\n \"acc_stderr\": 0.004558933822995549,\n \"acc_norm\": 0.8828918542123083,\n \"acc_norm_stderr\": 0.0032089195103099334\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 
0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7170804043560985,\n \"mc2_stderr\": 0.01453935317893205\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068677\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \"acc_stderr\": 0.012384789310940244\n }\n}\n```", "repo_url": 
"https://huggingface.co/shadowml/WestBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-58.611202.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-58.611202.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-58.611202.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-58.611202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-58.611202.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_44_58.611202", "path": ["**/details_harness|winogrande|5_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T21-44-58.611202.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T21_44_58.611202", "path": ["results_2024-02-01T21-44-58.611202.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T21-44-58.611202.parquet"]}]}]}
2024-02-01T21:47:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shadowml/WestBeagle-7B Dataset automatically created during the evaluation run of model shadowml/WestBeagle-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:44:58.611202 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
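A minimal sketch of the load call the card refers to, assuming the repository id follows the open-llm-leaderboard/details_<org>__<model> convention used by the other evaluation-run cards in this document:

```python
from datasets import load_dataset

# Minimal sketch: load the Winogrande details for this run.
# The repository id is assumed from the naming convention used by the
# other evaluation-run cards in this document.
data = load_dataset(
    "open-llm-leaderboard/details_shadowml__WestBeagle-7B",
    "harness_winogrande_5",
    split="train",
)
print(data)
```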
[ "# Dataset Card for Evaluation run of shadowml/WestBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model shadowml/WestBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:44:58.611202(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shadowml/WestBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model shadowml/WestBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:44:58.611202(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
30567f53b9ee593b2a1b888139a52614bb71bcd8
# Dataset Card for Evaluation run of cyberagent/calm2-7b-chat-dpo-experimental <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [cyberagent/calm2-7b-chat-dpo-experimental](https://huggingface.co/cyberagent/calm2-7b-chat-dpo-experimental) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T21:56:29.984219](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental/blob/main/results_2024-02-01T21-56-29.984219.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.39856874794500985, "acc_stderr": 0.03441229732329181, "acc_norm": 0.4033694107520943, "acc_norm_stderr": 0.03524214558743911, "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219373, "mc2": 0.43126020642044494, "mc2_stderr": 0.01476646245339252 }, "harness|arc:challenge|25": { "acc": 0.3856655290102389, "acc_stderr": 0.014224250973257172, "acc_norm": 0.4104095563139932, "acc_norm_stderr": 0.014374922192642662 }, "harness|hellaswag|10": { "acc": 0.5165305715992831, "acc_stderr": 0.0049870536525402675, "acc_norm": 0.6899024098785103, "acc_norm_stderr": 0.004615880352799746 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4342105263157895, "acc_stderr": 0.04033565667848319, "acc_norm": 0.4342105263157895, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4075471698113208, "acc_stderr": 0.030242233800854494, "acc_norm": 0.4075471698113208, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3819444444444444, "acc_stderr": 0.040629907841466674, "acc_norm": 0.3819444444444444, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.28, "acc_stderr": 0.04512608598542124, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542124 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.32, "acc_stderr": 
0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.36416184971098264, "acc_stderr": 0.036690724774169084, "acc_norm": 0.36416184971098264, "acc_norm_stderr": 0.036690724774169084 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808779, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808779 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3404255319148936, "acc_stderr": 0.030976692998534436, "acc_norm": 0.3404255319148936, "acc_norm_stderr": 0.030976692998534436 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3586206896551724, "acc_stderr": 0.039966295748767186, "acc_norm": 0.3586206896551724, "acc_norm_stderr": 0.039966295748767186 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2698412698412698, "acc_stderr": 0.022860838309232072, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.022860838309232072 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235173, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3870967741935484, "acc_stderr": 0.02770935967503249, "acc_norm": 0.3870967741935484, "acc_norm_stderr": 0.02770935967503249 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3251231527093596, "acc_stderr": 0.032957975663112704, "acc_norm": 0.3251231527093596, "acc_norm_stderr": 0.032957975663112704 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.48484848484848486, "acc_stderr": 0.03902551007374448, "acc_norm": 0.48484848484848486, "acc_norm_stderr": 0.03902551007374448 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.48484848484848486, "acc_stderr": 0.03560716516531061, "acc_norm": 0.48484848484848486, "acc_norm_stderr": 0.03560716516531061 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5129533678756477, "acc_stderr": 0.036072280610477486, "acc_norm": 0.5129533678756477, "acc_norm_stderr": 0.036072280610477486 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.33076923076923076, "acc_stderr": 0.02385479568097114, "acc_norm": 0.33076923076923076, "acc_norm_stderr": 0.02385479568097114 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844082, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3067226890756303, "acc_stderr": 0.02995382389188705, "acc_norm": 0.3067226890756303, "acc_norm_stderr": 0.02995382389188705 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.46605504587155966, "acc_stderr": 0.021387863350353996, "acc_norm": 0.46605504587155966, "acc_norm_stderr": 0.021387863350353996 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.26851851851851855, "acc_stderr": 0.030225226160012383, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.030225226160012383 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.46078431372549017, "acc_stderr": 0.03498501649369527, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.03498501649369527 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.510548523206751, "acc_stderr": 0.032539983791662855, "acc_norm": 0.510548523206751, "acc_norm_stderr": 0.032539983791662855 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4484304932735426, "acc_stderr": 0.033378837362550984, "acc_norm": 0.4484304932735426, "acc_norm_stderr": 0.033378837362550984 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.40458015267175573, "acc_stderr": 0.043046937953806645, "acc_norm": 0.40458015267175573, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5371900826446281, "acc_stderr": 0.04551711196104218, "acc_norm": 0.5371900826446281, "acc_norm_stderr": 0.04551711196104218 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4722222222222222, "acc_stderr": 0.04826217294139894, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.04826217294139894 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3803680981595092, "acc_stderr": 0.03814269893261836, "acc_norm": 0.3803680981595092, "acc_norm_stderr": 0.03814269893261836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285713, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285713 }, "harness|hendrycksTest-management|5": { "acc": 0.3883495145631068, "acc_stderr": 0.0482572933735639, "acc_norm": 0.3883495145631068, "acc_norm_stderr": 0.0482572933735639 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5, "acc_stderr": 0.03275608910402091, "acc_norm": 0.5, "acc_norm_stderr": 0.03275608910402091 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.48, "acc_stderr": 0.05021167315686781, "acc_norm": 0.48, "acc_norm_stderr": 0.05021167315686781 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.508301404853129, "acc_stderr": 0.017877498991072008, "acc_norm": 0.508301404853129, "acc_norm_stderr": 0.017877498991072008 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.41329479768786126, "acc_stderr": 0.026511261369409247, "acc_norm": 0.41329479768786126, "acc_norm_stderr": 0.026511261369409247 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2837988826815642, "acc_stderr": 0.015078358970751774, "acc_norm": 0.2837988826815642, "acc_norm_stderr": 0.015078358970751774 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4150326797385621, "acc_stderr": 0.028213504177824103, "acc_norm": 0.4150326797385621, "acc_norm_stderr": 0.028213504177824103 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.4115755627009646, "acc_stderr": 0.02795048149440126, "acc_norm": 0.4115755627009646, "acc_norm_stderr": 0.02795048149440126 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.44135802469135804, "acc_stderr": 0.02762873715566878, "acc_norm": 0.44135802469135804, 
"acc_norm_stderr": 0.02762873715566878 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.30851063829787234, "acc_stderr": 0.027553366165101373, "acc_norm": 0.30851063829787234, "acc_norm_stderr": 0.027553366165101373 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3344198174706649, "acc_stderr": 0.012049668983214934, "acc_norm": 0.3344198174706649, "acc_norm_stderr": 0.012049668983214934 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41911764705882354, "acc_stderr": 0.029972807170464622, "acc_norm": 0.41911764705882354, "acc_norm_stderr": 0.029972807170464622 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3627450980392157, "acc_stderr": 0.019450768432505518, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.019450768432505518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.43636363636363634, "acc_stderr": 0.04750185058907297, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4775510204081633, "acc_stderr": 0.031976941187136725, "acc_norm": 0.4775510204081633, "acc_norm_stderr": 0.031976941187136725 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5422885572139303, "acc_stderr": 0.035228658640995975, "acc_norm": 0.5422885572139303, "acc_norm_stderr": 0.035228658640995975 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.59, "acc_stderr": 0.04943110704237101, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237101 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.038284011150790206, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.038284011150790206 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5380116959064327, "acc_stderr": 0.03823727092882307, "acc_norm": 0.5380116959064327, "acc_norm_stderr": 0.03823727092882307 }, "harness|truthfulqa:mc|0": { "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219373, "mc2": 0.43126020642044494, "mc2_stderr": 0.01476646245339252 }, "harness|winogrande|5": { "acc": 0.6566692975532754, "acc_stderr": 0.013344823185358004 }, "harness|gsm8k|5": { "acc": 0.05534495830174375, "acc_stderr": 0.006298221796179574 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
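The aggregated metrics reproduced above live in the "results" configuration; a minimal sketch of pulling them into pandas for inspection (the exact column layout of the results table is an assumption, only the repository id comes from the card itself):

```python
from datasets import load_dataset

# Sketch: fetch the aggregated "results" table for this run and inspect it.
# The column layout of the underlying parquet file is assumed, not documented above.
results = load_dataset(
    "open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental",
    "results",
    split="latest",
)
print(results.to_pandas().head())
```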
open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental
[ "region:us" ]
2024-02-01T21:58:45+00:00
{"pretty_name": "Evaluation run of cyberagent/calm2-7b-chat-dpo-experimental", "dataset_summary": "Dataset automatically created during the evaluation run of model [cyberagent/calm2-7b-chat-dpo-experimental](https://huggingface.co/cyberagent/calm2-7b-chat-dpo-experimental) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T21:56:29.984219](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental/blob/main/results_2024-02-01T21-56-29.984219.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39856874794500985,\n \"acc_stderr\": 0.03441229732329181,\n \"acc_norm\": 0.4033694107520943,\n \"acc_norm_stderr\": 0.03524214558743911,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.43126020642044494,\n \"mc2_stderr\": 0.01476646245339252\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.014224250973257172,\n \"acc_norm\": 0.4104095563139932,\n \"acc_norm_stderr\": 0.014374922192642662\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5165305715992831,\n \"acc_stderr\": 0.0049870536525402675,\n \"acc_norm\": 0.6899024098785103,\n \"acc_norm_stderr\": 0.004615880352799746\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4075471698113208,\n \"acc_stderr\": 0.030242233800854494,\n \"acc_norm\": 0.4075471698113208,\n \"acc_norm_stderr\": 0.030242233800854494\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542124,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542124\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.036690724774169084,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.036690724774169084\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808779,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808779\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534436,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534436\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n \"acc_stderr\": 0.02770935967503249,\n \"acc_norm\": 0.3870967741935484,\n \"acc_norm_stderr\": 0.02770935967503249\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03560716516531061,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03560716516531061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5129533678756477,\n \"acc_stderr\": 0.036072280610477486,\n \"acc_norm\": 0.5129533678756477,\n 
\"acc_norm_stderr\": 0.036072280610477486\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.02385479568097114,\n \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.02385479568097114\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188705,\n \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188705\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.46605504587155966,\n \"acc_stderr\": 0.021387863350353996,\n \"acc_norm\": 0.46605504587155966,\n \"acc_norm_stderr\": 0.021387863350353996\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012383,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012383\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.510548523206751,\n \"acc_stderr\": 0.032539983791662855,\n \"acc_norm\": 0.510548523206751,\n \"acc_norm_stderr\": 0.032539983791662855\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4484304932735426,\n \"acc_stderr\": 0.033378837362550984,\n \"acc_norm\": 0.4484304932735426,\n \"acc_norm_stderr\": 0.033378837362550984\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261836,\n \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.0482572933735639,\n \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.0482572933735639\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03275608910402091,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03275608910402091\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n \"acc_stderr\": 0.017877498991072008,\n \"acc_norm\": 0.508301404853129,\n \"acc_norm_stderr\": 0.017877498991072008\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41329479768786126,\n \"acc_stderr\": 0.026511261369409247,\n \"acc_norm\": 0.41329479768786126,\n \"acc_norm_stderr\": 0.026511261369409247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n \"acc_stderr\": 0.015078358970751774,\n \"acc_norm\": 0.2837988826815642,\n \"acc_norm_stderr\": 0.015078358970751774\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824103,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824103\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n \"acc_stderr\": 0.02795048149440126,\n \"acc_norm\": 0.4115755627009646,\n \"acc_norm_stderr\": 0.02795048149440126\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.44135802469135804,\n \"acc_stderr\": 0.02762873715566878,\n \"acc_norm\": 0.44135802469135804,\n \"acc_norm_stderr\": 0.02762873715566878\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101373,\n \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101373\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3344198174706649,\n \"acc_stderr\": 0.012049668983214934,\n \"acc_norm\": 0.3344198174706649,\n \"acc_norm_stderr\": 0.012049668983214934\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.019450768432505518,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.019450768432505518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5422885572139303,\n \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.5422885572139303,\n \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.43126020642044494,\n \"mc2_stderr\": 0.01476646245339252\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6566692975532754,\n \"acc_stderr\": 0.013344823185358004\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179574\n }\n}\n```", "repo_url": "https://huggingface.co/cyberagent/calm2-7b-chat-dpo-experimental", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-56-29.984219.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-56-29.984219.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-56-29.984219.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T21-56-29.984219.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-56-29.984219.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["**/details_harness|winogrande|5_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T21-56-29.984219.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T21_56_29.984219", "path": ["results_2024-02-01T21-56-29.984219.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T21-56-29.984219.parquet"]}]}]}
2024-02-01T21:59:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of cyberagent/calm2-7b-chat-dpo-experimental Dataset automatically created during the evaluation run of model cyberagent/calm2-7b-chat-dpo-experimental on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T21:56:29.984219 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
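The flattened summary above says "you can for instance do the following" but the code block was dropped when the card text was flattened; a minimal sketch of that load call, using the repository id and the `harness_winogrande_5` config named in this record's metadata (any other listed config can be substituted), is:

```python
from datasets import load_dataset

# Per-sample details for one task of this evaluation run; the "train" split
# always points at the latest results, as described in the card text.
data = load_dataset(
    "open-llm-leaderboard/details_cyberagent__calm2-7b-chat-dpo-experimental",
    "harness_winogrande_5",
    split="train",
)
```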
[ "# Dataset Card for Evaluation run of cyberagent/calm2-7b-chat-dpo-experimental\n\n\n\nDataset automatically created during the evaluation run of model cyberagent/calm2-7b-chat-dpo-experimental on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:56:29.984219(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of cyberagent/calm2-7b-chat-dpo-experimental\n\n\n\nDataset automatically created during the evaluation run of model cyberagent/calm2-7b-chat-dpo-experimental on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T21:56:29.984219(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
276b85d27bae9331e50187bd09bfdd2a65a2e745
# Dataset Card for Evaluation run of PotatoOff/Michel-13B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [PotatoOff/Michel-13B](https://huggingface.co/PotatoOff/Michel-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PotatoOff__Michel-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:05:33.263550](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__Michel-13B/blob/main/results_2024-02-01T22-05-33.263550.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5499141517864713, "acc_stderr": 0.034037566570250866, "acc_norm": 0.556324343200096, "acc_norm_stderr": 0.03477678629039932, "mc1": 0.35006119951040393, "mc1_stderr": 0.01669794942015103, "mc2": 0.5043477199409111, "mc2_stderr": 0.015764099492460493 }, "harness|arc:challenge|25": { "acc": 0.5767918088737202, "acc_stderr": 0.014438036220848034, "acc_norm": 0.6126279863481229, "acc_norm_stderr": 0.01423587248790987 }, "harness|hellaswag|10": { "acc": 0.6357299342760406, "acc_stderr": 0.004802413919932666, "acc_norm": 0.832105158334993, "acc_norm_stderr": 0.0037300899105375796 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5592105263157895, "acc_stderr": 0.04040311062490436, "acc_norm": 0.5592105263157895, "acc_norm_stderr": 0.04040311062490436 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5584905660377358, "acc_stderr": 0.030561590426731837, "acc_norm": 0.5584905660377358, "acc_norm_stderr": 0.030561590426731837 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.038118909889404126, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.038118909889404126 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.03261936918467382, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.043727482902780064, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.043727482902780064 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35714285714285715, "acc_stderr": 0.024677862841332783, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.024677862841332783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6258064516129033, "acc_stderr": 0.0275289042998457, "acc_norm": 0.6258064516129033, "acc_norm_stderr": 0.0275289042998457 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03481904844438803, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03481904844438803 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6121212121212121, "acc_stderr": 0.038049136539710114, "acc_norm": 0.6121212121212121, "acc_norm_stderr": 0.038049136539710114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.030031147977641538, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.030031147977641538 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4846153846153846, "acc_stderr": 0.025339003010106515, "acc_norm": 0.4846153846153846, "acc_norm_stderr": 0.025339003010106515 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 0.03218358107742613, "acc_norm": 0.5672268907563025, "acc_norm_stderr": 0.03218358107742613 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 
0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7247706422018348, "acc_stderr": 0.019149093743155203, "acc_norm": 0.7247706422018348, "acc_norm_stderr": 0.019149093743155203 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854053, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854053 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604243, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.02931281415395593, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.02931281415395593 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.03252113489929188, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.03252113489929188 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.043285772152629715, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.043285772152629715 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908706, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908706 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6851851851851852, "acc_stderr": 0.04489931073591312, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935574, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935574 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.046355501356099754, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.046355501356099754 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.782051282051282, "acc_stderr": 0.02704685763071668, "acc_norm": 0.782051282051282, "acc_norm_stderr": 0.02704685763071668 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905716, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905716 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5953757225433526, "acc_stderr": 0.02642481659400985, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.02642481659400985 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3553072625698324, "acc_stderr": 0.016006989934803182, "acc_norm": 0.3553072625698324, "acc_norm_stderr": 0.016006989934803182 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5816993464052288, "acc_stderr": 0.028245134024387292, "acc_norm": 0.5816993464052288, "acc_norm_stderr": 0.028245134024387292 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6388888888888888, "acc_stderr": 0.02672586880910079, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.02672586880910079 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.40070921985815605, "acc_stderr": 0.02923346574557309, "acc_norm": 0.40070921985815605, "acc_norm_stderr": 0.02923346574557309 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4152542372881356, "acc_stderr": 0.012585471793400659, "acc_norm": 0.4152542372881356, "acc_norm_stderr": 0.012585471793400659 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5147058823529411, "acc_stderr": 0.03035969707904612, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5424836601307189, "acc_stderr": 0.020154685712590898, "acc_norm": 0.5424836601307189, "acc_norm_stderr": 0.020154685712590898 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 0.04709306978661895, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661895 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5959183673469388, "acc_stderr": 0.03141470802586589, "acc_norm": 0.5959183673469388, "acc_norm_stderr": 0.03141470802586589 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.031157150869355558, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.031157150869355558 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.35006119951040393, "mc1_stderr": 0.01669794942015103, "mc2": 0.5043477199409111, "mc2_stderr": 0.015764099492460493 }, "harness|winogrande|5": { "acc": 0.7521704814522494, "acc_stderr": 0.012134386019865348 }, "harness|gsm8k|5": { "acc": 0.20166793025018953, "acc_stderr": 0.01105229588954436 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
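As a complementary usage sketch (not a definitive recipe), the aggregated metrics of the most recent run can be read from the "results" configuration and its "latest" split listed in this card's configuration table; the exact column layout depends on the harness version that produced the run, so inspect the schema before relying on specific column names:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run; the "latest" split always
# points to the newest results file (results_2024-02-01T22-05-33.263550.parquet here).
results = load_dataset(
    "open-llm-leaderboard/details_PotatoOff__Michel-13B",
    "results",
    split="latest",
)

# The schema is produced by the evaluation harness, so list the available columns
# first and then look at the single aggregated row.
print(results.column_names)
print(results[0])
```

The per-task configurations (for example "harness_arc_challenge_25" or "harness_gsm8k_5") can be loaded the same way to obtain example-level details for each benchmark.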
open-llm-leaderboard/details_PotatoOff__Michel-13B
[ "region:us" ]
2024-02-01T22:08:01+00:00
{"pretty_name": "Evaluation run of PotatoOff/Michel-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [PotatoOff/Michel-13B](https://huggingface.co/PotatoOff/Michel-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PotatoOff__Michel-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:05:33.263550](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__Michel-13B/blob/main/results_2024-02-01T22-05-33.263550.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5499141517864713,\n \"acc_stderr\": 0.034037566570250866,\n \"acc_norm\": 0.556324343200096,\n \"acc_norm_stderr\": 0.03477678629039932,\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5043477199409111,\n \"mc2_stderr\": 0.015764099492460493\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848034,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n \"acc_stderr\": 0.004802413919932666,\n \"acc_norm\": 0.832105158334993,\n \"acc_norm_stderr\": 0.0037300899105375796\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n \"acc_stderr\": 0.0275289042998457,\n \"acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.0275289042998457\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155203,\n \"acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155203\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.02704685763071668,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.02704685763071668\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n \"acc_stderr\": 0.015104550008905716,\n \"acc_norm\": 
0.7675606641123882,\n \"acc_norm_stderr\": 0.015104550008905716\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.02642481659400985,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.02642481659400985\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n \"acc_stderr\": 0.016006989934803182,\n \"acc_norm\": 0.3553072625698324,\n \"acc_norm_stderr\": 0.016006989934803182\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557309,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557309\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n \"acc_stderr\": 0.012585471793400659,\n \"acc_norm\": 0.4152542372881356,\n \"acc_norm_stderr\": 0.012585471793400659\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590898,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590898\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5043477199409111,\n \"mc2_stderr\": 0.015764099492460493\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20166793025018953,\n \"acc_stderr\": 0.01105229588954436\n }\n}\n```", "repo_url": 
"https://huggingface.co/PotatoOff/Michel-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-05-33.263550.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-05-33.263550.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-05-33.263550.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-05-33.263550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-05-33.263550.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_05_33.263550", "path": ["**/details_harness|winogrande|5_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-05-33.263550.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_05_33.263550", "path": ["results_2024-02-01T22-05-33.263550.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-05-33.263550.parquet"]}]}]}
2024-02-01T22:08:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PotatoOff/Michel-13B Dataset automatically created during the evaluation run of model PotatoOff/Michel-13B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:05:33.263550 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of PotatoOff/Michel-13B\n\n\n\nDataset automatically created during the evaluation run of model PotatoOff/Michel-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:05:33.263550(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PotatoOff/Michel-13B\n\n\n\nDataset automatically created during the evaluation run of model PotatoOff/Michel-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:05:33.263550(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
799dc88d46ca113c3b0964197c863a81b1d65148
# Dataset Card for Evaluation run of MRAIRR/mini_7B_dare_v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [MRAIRR/mini_7B_dare_v1](https://huggingface.co/MRAIRR/mini_7B_dare_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:06:49.514439](https://huggingface.co/datasets/open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1/blob/main/results_2024-02-01T22-06-49.514439.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5971205134841852, "acc_stderr": 0.03305252247330764, "acc_norm": 0.5993597039453373, "acc_norm_stderr": 0.03371332021374718, "mc1": 0.3806609547123623, "mc1_stderr": 0.01699762787190793, "mc2": 0.5464175107671695, "mc2_stderr": 0.01554949662717814 }, "harness|arc:challenge|25": { "acc": 0.5784982935153583, "acc_stderr": 0.014430197069326023, "acc_norm": 0.6177474402730375, "acc_norm_stderr": 0.014200454049979282 }, "harness|hellaswag|10": { "acc": 0.595399322844055, "acc_stderr": 0.004898115110975035, "acc_norm": 0.7991435968930491, "acc_norm_stderr": 0.003998220753048877 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.03953173377749194, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365252, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365252 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.03800968060554859, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.03800968060554859 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006716, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006716 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.032683358999363366, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.02494236893115979, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.02494236893115979 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377561, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377561 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7193548387096774, "acc_stderr": 0.025560604721022895, "acc_norm": 0.7193548387096774, "acc_norm_stderr": 0.025560604721022895 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124488, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124488 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723872, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723872 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5641025641025641, "acc_stderr": 0.02514180151117749, "acc_norm": 0.5641025641025641, "acc_norm_stderr": 0.02514180151117749 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473075, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413926, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413926 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 
0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.781651376146789, "acc_stderr": 0.017712600528722724, "acc_norm": 0.781651376146789, "acc_norm_stderr": 0.017712600528722724 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3425925925925926, "acc_stderr": 0.03236585252602158, "acc_norm": 0.3425925925925926, "acc_norm_stderr": 0.03236585252602158 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.028626547912437406, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.028626547912437406 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.0306365913486998, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.0306365913486998 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6717557251908397, "acc_stderr": 0.04118438565806298, "acc_norm": 0.6717557251908397, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.03559039531617342, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.03559039531617342 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7828863346104725, "acc_stderr": 0.014743125394823298, "acc_norm": 0.7828863346104725, "acc_norm_stderr": 0.014743125394823298 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6734104046242775, "acc_stderr": 0.02524826477424284, "acc_norm": 0.6734104046242775, "acc_norm_stderr": 0.02524826477424284 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.29497206703910617, "acc_stderr": 0.015251931579208181, "acc_norm": 0.29497206703910617, "acc_norm_stderr": 0.015251931579208181 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.673202614379085, "acc_stderr": 0.026857294663281406, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.026857294663281406 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6688102893890675, "acc_stderr": 0.026730620728004903, "acc_norm": 0.6688102893890675, "acc_norm_stderr": 0.026730620728004903 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6697530864197531, "acc_stderr": 0.026168298456732846, "acc_norm": 0.6697530864197531, "acc_norm_stderr": 0.026168298456732846 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.425531914893617, "acc_stderr": 0.029494827600144376, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.029494827600144376 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4361147327249022, "acc_stderr": 0.012665568135455328, "acc_norm": 0.4361147327249022, "acc_norm_stderr": 0.012665568135455328 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5514705882352942, "acc_stderr": 0.030211479609121596, "acc_norm": 0.5514705882352942, "acc_norm_stderr": 0.030211479609121596 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.019722058939618068, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.019722058939618068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6693877551020408, "acc_stderr": 0.030116426296540603, "acc_norm": 0.6693877551020408, "acc_norm_stderr": 0.030116426296540603 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.030944459778533207, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.030944459778533207 }, "harness|truthfulqa:mc|0": { "mc1": 0.3806609547123623, "mc1_stderr": 0.01699762787190793, "mc2": 0.5464175107671695, "mc2_stderr": 0.01554949662717814 }, "harness|winogrande|5": { "acc": 0.739542225730071, "acc_stderr": 0.01233483367199829 }, "harness|gsm8k|5": { "acc": 0.5655799848369977, "acc_stderr": 0.013653507211411415 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
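As a small illustration of the split layout this card describes (a sketch, not part of the original card: the timestamped split name is copied from this record's `configs` metadata, `latest` mirrors the most recent run, and the `results` configuration is assumed to hold the aggregated metrics as in the other detail datasets in this dump):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1"

# Run-specific split, named after the evaluation timestamp recorded in the metadata.
run_details = load_dataset(repo, "harness_winogrande_5", split="2024_02_01T22_06_49.514439")

# The "latest" split always points at the most recent evaluation run.
latest_details = load_dataset(repo, "harness_winogrande_5", split="latest")

# Aggregated metrics for the whole run live in the separate "results" configuration.
results = load_dataset(repo, "results", split="latest")
```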
open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1
[ "region:us" ]
2024-02-01T22:09:13+00:00
{"pretty_name": "Evaluation run of MRAIRR/mini_7B_dare_v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [MRAIRR/mini_7B_dare_v1](https://huggingface.co/MRAIRR/mini_7B_dare_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:06:49.514439](https://huggingface.co/datasets/open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1/blob/main/results_2024-02-01T22-06-49.514439.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5971205134841852,\n \"acc_stderr\": 0.03305252247330764,\n \"acc_norm\": 0.5993597039453373,\n \"acc_norm_stderr\": 0.03371332021374718,\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5464175107671695,\n \"mc2_stderr\": 0.01554949662717814\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979282\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.595399322844055,\n \"acc_stderr\": 0.004898115110975035,\n \"acc_norm\": 0.7991435968930491,\n \"acc_norm_stderr\": 0.003998220753048877\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365252,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365252\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n 
\"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377561,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377561\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.02514180151117749,\n \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.02514180151117749\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.781651376146789,\n \"acc_stderr\": 0.017712600528722724,\n \"acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722724\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823298,\n 
\"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823298\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n \"acc_stderr\": 0.015251931579208181,\n \"acc_norm\": 0.29497206703910617,\n \"acc_norm_stderr\": 0.015251931579208181\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281406,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281406\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144376,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144376\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455328,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455328\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121596,\n \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121596\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5464175107671695,\n \"mc2_stderr\": 0.01554949662717814\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.01233483367199829\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5655799848369977,\n \"acc_stderr\": 0.013653507211411415\n }\n}\n```", "repo_url": 
"https://huggingface.co/MRAIRR/mini_7B_dare_v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_06_49.514439", "path": ["**/details_harness|winogrande|5_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-06-49.514439.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_06_49.514439", "path": ["results_2024-02-01T22-06-49.514439.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-06-49.514439.parquet"]}]}]}
2024-02-01T22:09:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MRAIRR/mini_7B_dare_v1 Dataset automatically created during the evaluation run of model MRAIRR/mini_7B_dare_v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:06:49.514439 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
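The flattened card text above keeps the "To load the details from a run, you can for instance do the following:" sentence, but its accompanying snippet was stripped in this rendering. A minimal sketch of that load follows; the repository id is an assumption inferred from the `details_<org>__<model>` naming pattern used by the other evaluation-details datasets in this dump, and the config names come from the configuration listing in the metadata above.

```python
# Minimal sketch; the repo id is assumed from the details_<org>__<model> pattern.
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_MRAIRR__mini_7B_dare_v1"  # assumed repo id

# Enumerate the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# Load one sub-task's details; "latest" always points at the most recent run,
# while the timestamped split (e.g. "2024_02_01T22_06_49.514439") pins this one.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```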
[ "# Dataset Card for Evaluation run of MRAIRR/mini_7B_dare_v1\n\n\n\nDataset automatically created during the evaluation run of model MRAIRR/mini_7B_dare_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:06:49.514439(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MRAIRR/mini_7B_dare_v1\n\n\n\nDataset automatically created during the evaluation run of model MRAIRR/mini_7B_dare_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:06:49.514439(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
be6774d20c7dc3172cf6696bd1489be78621931c
# Dataset Card for Evaluation run of andysalerno/mistral-sft-v3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andysalerno/mistral-sft-v3](https://huggingface.co/andysalerno/mistral-sft-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andysalerno__mistral-sft-v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:26:34.226516](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__mistral-sft-v3/blob/main/results_2024-02-01T22-26-34.226516.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6300888206326156, "acc_stderr": 0.03234208868308631, "acc_norm": 0.6368476388384399, "acc_norm_stderr": 0.033000158601577026, "mc1": 0.3219094247246022, "mc1_stderr": 0.016355567611960397, "mc2": 0.48486973835959385, "mc2_stderr": 0.01523378921503333 }, "harness|arc:challenge|25": { "acc": 0.5665529010238908, "acc_stderr": 0.014481376224558902, "acc_norm": 0.613481228668942, "acc_norm_stderr": 0.014230084761910471 }, "harness|hellaswag|10": { "acc": 0.629555865365465, "acc_stderr": 0.004819367172685962, "acc_norm": 0.8223461461860188, "acc_norm_stderr": 0.003814399385087734 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.03842498559395268, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.03842498559395268 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.025167982333894143, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.025167982333894143 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7645161290322581, "acc_stderr": 0.02413763242933771, "acc_norm": 0.7645161290322581, "acc_norm_stderr": 0.02413763242933771 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059278, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059278 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8091743119266055, "acc_stderr": 0.016847676400091098, "acc_norm": 0.8091743119266055, "acc_norm_stderr": 0.016847676400091098 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639318, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.0364129708131373, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909456, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909456 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.03157065078911901, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.03157065078911901 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7138728323699421, "acc_stderr": 0.02433214677913413, "acc_norm": 0.7138728323699421, "acc_norm_stderr": 0.02433214677913413 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2871508379888268, "acc_stderr": 0.015131608849963759, "acc_norm": 0.2871508379888268, "acc_norm_stderr": 0.015131608849963759 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958154, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958154 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153262, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153262 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.02979071924382972, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.02979071924382972 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44198174706649285, "acc_stderr": 0.01268397251359881, "acc_norm": 0.44198174706649285, "acc_norm_stderr": 0.01268397251359881 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069446, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069446 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784603, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784603 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454132, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454132 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.3219094247246022, "mc1_stderr": 0.016355567611960397, "mc2": 0.48486973835959385, "mc2_stderr": 0.01523378921503333 }, "harness|winogrande|5": { "acc": 0.77663772691397, "acc_stderr": 0.011705697565205201 }, "harness|gsm8k|5": { "acc": 0.3244882486732373, "acc_stderr": 0.012896095359768107 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
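Beyond the per-example detail configs, the card above also describes an aggregated "results" configuration. Below is a short sketch of reading those aggregates for this run; the repository id, the "results" config name, and the "latest" split all come from the card and the metadata below, but the exact column layout of the results parquet is not shown here, so the code only inspects the schema rather than assuming specific field names.

```python
# Sketch: load the aggregated metrics for the andysalerno/mistral-sft-v3 run.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__mistral-sft-v3",
    "results",        # aggregated metrics, as described in the card above
    split="latest",   # alias for the most recent run (2024-02-01T22:26:34)
)

print(results)               # row count and column names
print(results.column_names)  # check the schema before relying on specific fields
```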
open-llm-leaderboard/details_andysalerno__mistral-sft-v3
[ "region:us" ]
2024-02-01T22:28:55+00:00
{"pretty_name": "Evaluation run of andysalerno/mistral-sft-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/mistral-sft-v3](https://huggingface.co/andysalerno/mistral-sft-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__mistral-sft-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:26:34.226516](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__mistral-sft-v3/blob/main/results_2024-02-01T22-26-34.226516.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6300888206326156,\n \"acc_stderr\": 0.03234208868308631,\n \"acc_norm\": 0.6368476388384399,\n \"acc_norm_stderr\": 0.033000158601577026,\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48486973835959385,\n \"mc2_stderr\": 0.01523378921503333\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910471\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n \"acc_stderr\": 0.004819367172685962,\n \"acc_norm\": 0.8223461461860188,\n \"acc_norm_stderr\": 0.003814399385087734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091098,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091098\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2871508379888268,\n \"acc_stderr\": 0.015131608849963759,\n \"acc_norm\": 0.2871508379888268,\n \"acc_norm_stderr\": 0.015131608849963759\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48486973835959385,\n \"mc2_stderr\": 0.01523378921503333\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3244882486732373,\n \"acc_stderr\": 0.012896095359768107\n 
}\n}\n```", "repo_url": "https://huggingface.co/andysalerno/mistral-sft-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-26-34.226516.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-26-34.226516.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-26-34.226516.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-26-34.226516.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-26-34.226516.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_26_34.226516", "path": ["**/details_harness|winogrande|5_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-26-34.226516.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_26_34.226516", "path": ["results_2024-02-01T22-26-34.226516.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-26-34.226516.parquet"]}]}]}
2024-02-01T22:29:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andysalerno/mistral-sft-v3 Dataset automatically created during the evaluation run of model andysalerno/mistral-sft-v3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:26:34.226516 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
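A minimal sketch of such a loading call, following the `load_dataset` pattern used by the other evaluation-run cards in this dump; the repo id below is inferred from the model name (andysalerno/mistral-sft-v3) and should be treated as an assumption:

```python
from datasets import load_dataset

# Sketch only: the repo id follows the naming convention used by the other
# evaluation-run datasets in this dump and is an assumption, not taken from
# the card itself.
data = load_dataset(
    "open-llm-leaderboard/details_andysalerno__mistral-sft-v3",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="train",           # "train" always points to the latest results
)
print(data)
```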
[ "# Dataset Card for Evaluation run of andysalerno/mistral-sft-v3\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/mistral-sft-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:26:34.226516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andysalerno/mistral-sft-v3\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/mistral-sft-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:26:34.226516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
be6d153ffd5ed690df7e837b7f722f9b70062975
# Dataset Card for Evaluation run of gmonsoon/TinyUltra-4x1.1B-Base-Alpha <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [gmonsoon/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/gmonsoon/TinyUltra-4x1.1B-Base-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:31:22.257251](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T22-31-22.257251.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26201452650295837, "acc_stderr": 0.030950575098959248, "acc_norm": 0.26190159146597486, "acc_norm_stderr": 0.03169834440202644, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731618, "mc2": 0.3758799861882878, "mc2_stderr": 0.014070883279660485 }, "harness|arc:challenge|25": { "acc": 0.3447098976109215, "acc_stderr": 0.01388881628678211, "acc_norm": 0.34897610921501704, "acc_norm_stderr": 0.013928933461382504 }, "harness|hellaswag|10": { "acc": 0.46594303923521213, "acc_stderr": 0.004978192893406287, "acc_norm": 0.6142202748456482, "acc_norm_stderr": 0.004857840934549174 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2222222222222222, "acc_stderr": 0.035914440841969694, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.035914440841969694 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.03110318238312338, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.03110318238312338 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.27169811320754716, "acc_stderr": 0.027377706624670713, "acc_norm": 0.27169811320754716, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.24, "acc_stderr": 0.04292346959909282, 
"acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173043, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173043 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179961, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179961 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610334, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610334 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.0414243971948936, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.0414243971948936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.036196045241242515, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.036196045241242515 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.26129032258064516, "acc_stderr": 0.024993053397764826, "acc_norm": 0.26129032258064516, "acc_norm_stderr": 0.024993053397764826 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.031447125816782405, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.031447125816782405 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24848484848484848, "acc_stderr": 0.03374402644139404, "acc_norm": 0.24848484848484848, "acc_norm_stderr": 0.03374402644139404 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.21717171717171718, "acc_stderr": 0.029376616484945637, "acc_norm": 0.21717171717171718, "acc_norm_stderr": 0.029376616484945637 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22279792746113988, "acc_stderr": 0.03003114797764154, "acc_norm": 0.22279792746113988, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.021444547301560486, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.021444547301560486 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2689075630252101, "acc_stderr": 0.02880139219363128, "acc_norm": 0.2689075630252101, "acc_norm_stderr": 0.02880139219363128 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473835, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473835 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23669724770642203, "acc_stderr": 0.01822407811729908, "acc_norm": 0.23669724770642203, "acc_norm_stderr": 0.01822407811729908 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35648148148148145, "acc_stderr": 0.032664783315272714, "acc_norm": 0.35648148148148145, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.22549019607843138, "acc_stderr": 0.02933116229425173, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.02933116229425173 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.28270042194092826, "acc_stderr": 0.029312814153955934, "acc_norm": 0.28270042194092826, "acc_norm_stderr": 0.029312814153955934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3721973094170404, "acc_stderr": 0.032443052830087304, "acc_norm": 0.3721973094170404, "acc_norm_stderr": 0.032443052830087304 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.256198347107438, "acc_stderr": 0.03984979653302871, "acc_norm": 0.256198347107438, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3055555555555556, "acc_stderr": 0.044531975073749834, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.044531975073749834 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.2621359223300971, "acc_stderr": 0.04354631077260597, "acc_norm": 0.2621359223300971, "acc_norm_stderr": 0.04354631077260597 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2886334610472541, "acc_stderr": 0.016203792703197804, "acc_norm": 0.2886334610472541, "acc_norm_stderr": 0.016203792703197804 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24804469273743016, "acc_stderr": 0.014444157808261445, "acc_norm": 0.24804469273743016, "acc_norm_stderr": 0.014444157808261445 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24509803921568626, "acc_stderr": 0.024630048979824765, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.024630048979824765 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, "acc_stderr": 
0.024569223600460845, "acc_norm": 0.2654320987654321, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24822695035460993, "acc_stderr": 0.0257700156442904, "acc_norm": 0.24822695035460993, "acc_norm_stderr": 0.0257700156442904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23859191655801826, "acc_stderr": 0.0108859297420022, "acc_norm": 0.23859191655801826, "acc_norm_stderr": 0.0108859297420022 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.20220588235294118, "acc_stderr": 0.02439819298665492, "acc_norm": 0.20220588235294118, "acc_norm_stderr": 0.02439819298665492 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25980392156862747, "acc_stderr": 0.01774089950917779, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.01774089950917779 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.34545454545454546, "acc_stderr": 0.04554619617541054, "acc_norm": 0.34545454545454546, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.16326530612244897, "acc_stderr": 0.023661699177098622, "acc_norm": 0.16326530612244897, "acc_norm_stderr": 0.023661699177098622 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.030147775935409224, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.030147775935409224 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-virology|5": { "acc": 0.3132530120481928, "acc_stderr": 0.036108050180310235, "acc_norm": 0.3132530120481928, "acc_norm_stderr": 0.036108050180310235 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.03126781714663179, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731618, "mc2": 0.3758799861882878, "mc2_stderr": 0.014070883279660485 }, "harness|winogrande|5": { "acc": 0.6574585635359116, "acc_stderr": 0.013337483579075929 }, "harness|gsm8k|5": { "acc": 0.02577710386656558, "acc_stderr": 0.004365042953621804 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
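As a supplement to the card above: a minimal sketch of how one might read the aggregated metrics for this run, assuming the `datasets` library and taking the "results" configuration and "latest" split names from the card itself; the exact column layout of the results file is an assumption.

```python
from datasets import load_dataset

# Sketch only: the config name "results" and the split "latest" come from the
# card above; the flattened column layout of the results parquet is assumed,
# so inspect the columns before relying on any particular field name.
results = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha",
    "results",
    split="latest",
)
print(results.column_names)  # which aggregated metrics are available
print(results[0])            # the single row holding this run's aggregates
```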
open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha
[ "region:us" ]
2024-02-01T22:33:10+00:00
{"pretty_name": "Evaluation run of gmonsoon/TinyUltra-4x1.1B-Base-Alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [gmonsoon/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/gmonsoon/TinyUltra-4x1.1B-Base-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:31:22.257251](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T22-31-22.257251.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26201452650295837,\n \"acc_stderr\": 0.030950575098959248,\n \"acc_norm\": 0.26190159146597486,\n \"acc_norm_stderr\": 0.03169834440202644,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3447098976109215,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.34897610921501704,\n \"acc_norm_stderr\": 0.013928933461382504\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46594303923521213,\n \"acc_stderr\": 0.004978192893406287,\n \"acc_norm\": 0.6142202748456482,\n \"acc_norm_stderr\": 0.004857840934549174\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312338,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312338\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 
0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.02880139219363128,\n \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.02880139219363128\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 
0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261445,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261445\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824765,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824765\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.023661699177098622,\n \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.023661699177098622\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 
0.013337483579075929\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 0.004365042953621804\n }\n}\n```", "repo_url": "https://huggingface.co/gmonsoon/TinyUltra-4x1.1B-Base-Alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-31-22.257251.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-31-22.257251.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-31-22.257251.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-31-22.257251.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-31-22.257251.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["**/details_harness|winogrande|5_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T22-31-22.257251.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T22_31_22.257251", "path": ["results_2024-02-01T22-31-22.257251.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-31-22.257251.parquet"]}]}]}
2024-02-01T22:33:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of gmonsoon/TinyUltra-4x1.1B-Base-Alpha Dataset automatically created during the evaluation run of model gmonsoon/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:31:22.257251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of gmonsoon/TinyUltra-4x1.1B-Base-Alpha\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:31:22.257251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of gmonsoon/TinyUltra-4x1.1B-Base-Alpha\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:31:22.257251(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f73551272e804dbcd238f6328174644be5739db4
# Dataset Card for Evaluation run of daekeun-ml/phi-2-upscaled-4B-instruct-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [daekeun-ml/phi-2-upscaled-4B-instruct-v0.1](https://huggingface.co/daekeun-ml/phi-2-upscaled-4B-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:44:33.417264](https://huggingface.co/datasets/open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1/blob/main/results_2024-02-01T22-44-33.417264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2664182086042196, "acc_stderr": 0.03128712928450096, "acc_norm": 0.26770166649787397, "acc_norm_stderr": 0.032086763136653096, "mc1": 0.25703794369645044, "mc1_stderr": 0.01529807750948508, "mc2": 0.4092484911285778, "mc2_stderr": 0.01565770896423211 }, "harness|arc:challenge|25": { "acc": 0.1825938566552901, "acc_stderr": 0.011289730684565003, "acc_norm": 0.2295221843003413, "acc_norm_stderr": 0.012288926760890792 }, "harness|hellaswag|10": { "acc": 0.2773351921927903, "acc_stderr": 0.004467684132772413, "acc_norm": 0.286795459071898, "acc_norm_stderr": 0.0045134091149838405 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3037037037037037, "acc_stderr": 0.039725528847851375, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.039725528847851375 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.27631578947368424, "acc_stderr": 0.03639057569952925, "acc_norm": 0.27631578947368424, "acc_norm_stderr": 0.03639057569952925 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.29056603773584905, "acc_stderr": 0.027943219989337156, "acc_norm": 0.29056603773584905, "acc_norm_stderr": 0.027943219989337156 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.03586879280080341, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.34, 
"acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2254335260115607, "acc_stderr": 0.03186209851641144, "acc_norm": 0.2254335260115607, "acc_norm_stderr": 0.03186209851641144 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929775, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929775 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.23829787234042554, "acc_stderr": 0.02785125297388978, "acc_norm": 0.23829787234042554, "acc_norm_stderr": 0.02785125297388978 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21929824561403508, "acc_stderr": 0.03892431106518753, "acc_norm": 0.21929824561403508, "acc_norm_stderr": 0.03892431106518753 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2620689655172414, "acc_stderr": 0.036646663372252565, "acc_norm": 0.2620689655172414, "acc_norm_stderr": 0.036646663372252565 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.022569897074918417, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.022569897074918417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276864, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276864 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3161290322580645, "acc_stderr": 0.02645087448904277, "acc_norm": 0.3161290322580645, "acc_norm_stderr": 0.02645087448904277 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.031270907132976984, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.031270907132976984 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.32727272727272727, "acc_stderr": 0.03663974994391242, "acc_norm": 0.32727272727272727, "acc_norm_stderr": 0.03663974994391242 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.29292929292929293, "acc_stderr": 0.032424979581788166, "acc_norm": 0.29292929292929293, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3471502590673575, "acc_stderr": 0.034356961683613546, "acc_norm": 0.3471502590673575, "acc_norm_stderr": 0.034356961683613546 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.30256410256410254, "acc_stderr": 0.02329088805377272, "acc_norm": 0.30256410256410254, "acc_norm_stderr": 0.02329088805377272 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.29831932773109243, "acc_stderr": 0.029719142876342853, "acc_norm": 0.29831932773109243, "acc_norm_stderr": 
0.029719142876342853 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3284403669724771, "acc_stderr": 0.020135902797298395, "acc_norm": 0.3284403669724771, "acc_norm_stderr": 0.020135902797298395 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24019607843137256, "acc_stderr": 0.02998373305591362, "acc_norm": 0.24019607843137256, "acc_norm_stderr": 0.02998373305591362 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.17040358744394618, "acc_stderr": 0.025234593447136175, "acc_norm": 0.17040358744394618, "acc_norm_stderr": 0.025234593447136175 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.03915345408847836, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.35537190082644626, "acc_stderr": 0.04369236326573981, "acc_norm": 0.35537190082644626, "acc_norm_stderr": 0.04369236326573981 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.04077494709252628, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.04077494709252628 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2392638036809816, "acc_stderr": 0.033519538795212696, "acc_norm": 0.2392638036809816, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.1875, "acc_stderr": 0.0370468111477387, "acc_norm": 0.1875, "acc_norm_stderr": 0.0370468111477387 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690878, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.20512820512820512, "acc_stderr": 0.02645350805404035, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.02645350805404035 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.26181353767560667, "acc_stderr": 0.015720838678445266, "acc_norm": 0.26181353767560667, "acc_norm_stderr": 0.015720838678445266 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.20809248554913296, "acc_stderr": 0.02185525526342179, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.02185525526342179 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.01440029642922562, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.01440029642922562 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.23529411764705882, "acc_stderr": 0.024288619466046102, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.024288619466046102 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2540192926045016, "acc_stderr": 0.024723861504771696, "acc_norm": 0.2540192926045016, "acc_norm_stderr": 0.024723861504771696 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2191358024691358, "acc_stderr": 0.023016705640262196, 
"acc_norm": 0.2191358024691358, "acc_norm_stderr": 0.023016705640262196 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.29432624113475175, "acc_stderr": 0.027187127011503793, "acc_norm": 0.29432624113475175, "acc_norm_stderr": 0.027187127011503793 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2301173402868318, "acc_stderr": 0.010750183177375557, "acc_norm": 0.2301173402868318, "acc_norm_stderr": 0.010750183177375557 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4264705882352941, "acc_stderr": 0.030042615832714864, "acc_norm": 0.4264705882352941, "acc_norm_stderr": 0.030042615832714864 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2238562091503268, "acc_stderr": 0.016863008585416613, "acc_norm": 0.2238562091503268, "acc_norm_stderr": 0.016863008585416613 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.20909090909090908, "acc_stderr": 0.03895091015724137, "acc_norm": 0.20909090909090908, "acc_norm_stderr": 0.03895091015724137 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.24081632653061225, "acc_stderr": 0.027372942201788167, "acc_norm": 0.24081632653061225, "acc_norm_stderr": 0.027372942201788167 }, "harness|hendrycksTest-sociology|5": { "acc": 0.21393034825870647, "acc_stderr": 0.028996909693328916, "acc_norm": 0.21393034825870647, "acc_norm_stderr": 0.028996909693328916 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.21686746987951808, "acc_stderr": 0.032082844503563655, "acc_norm": 0.21686746987951808, "acc_norm_stderr": 0.032082844503563655 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2222222222222222, "acc_stderr": 0.031885780176863984, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.25703794369645044, "mc1_stderr": 0.01529807750948508, "mc2": 0.4092484911285778, "mc2_stderr": 0.01565770896423211 }, "harness|winogrande|5": { "acc": 0.5059194948697711, "acc_stderr": 0.014051500838485807 }, "harness|gsm8k|5": { "acc": 0.0075815011372251705, "acc_stderr": 0.002389281512077233 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
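As a complementary illustration of the configuration/split layout described in the card above (a minimal sketch, assuming this dataset exposes the same "results" configuration and "latest" split as the sibling leaderboard detail datasets in this dump; the per-task configuration name is taken from the card's own loading example):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1"

# Aggregated metrics for all evaluated tasks; the "latest" split always
# mirrors the most recent timestamped evaluation run.
results = load_dataset(REPO, "results", split="latest")

# Per-task details live in one configuration per task, e.g. the Winogrande
# run referenced in the card's loading example above.
winogrande_details = load_dataset(REPO, "harness_winogrande_5", split="latest")

print(results.column_names)
print(len(winogrande_details))
```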
open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1
[ "region:us" ]
2024-02-01T22:46:14+00:00
{"pretty_name": "Evaluation run of daekeun-ml/phi-2-upscaled-4B-instruct-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [daekeun-ml/phi-2-upscaled-4B-instruct-v0.1](https://huggingface.co/daekeun-ml/phi-2-upscaled-4B-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:44:33.417264](https://huggingface.co/datasets/open-llm-leaderboard/details_daekeun-ml__phi-2-upscaled-4B-instruct-v0.1/blob/main/results_2024-02-01T22-44-33.417264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2664182086042196,\n \"acc_stderr\": 0.03128712928450096,\n \"acc_norm\": 0.26770166649787397,\n \"acc_norm_stderr\": 0.032086763136653096,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4092484911285778,\n \"mc2_stderr\": 0.01565770896423211\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.1825938566552901,\n \"acc_stderr\": 0.011289730684565003,\n \"acc_norm\": 0.2295221843003413,\n \"acc_norm_stderr\": 0.012288926760890792\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2773351921927903,\n \"acc_stderr\": 0.004467684132772413,\n \"acc_norm\": 0.286795459071898,\n \"acc_norm_stderr\": 0.0045134091149838405\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.039725528847851375,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.039725528847851375\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952925,\n \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952925\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337156,\n \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337156\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388978,\n \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388978\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518753,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518753\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.03663974994391242,\n \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.03663974994391242\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29292929292929293,\n \"acc_stderr\": 0.032424979581788166,\n \"acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.032424979581788166\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n \"acc_norm\": 0.3471502590673575,\n 
\"acc_norm_stderr\": 0.034356961683613546\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377272,\n \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377272\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3284403669724771,\n \"acc_stderr\": 0.020135902797298395,\n \"acc_norm\": 0.3284403669724771,\n \"acc_norm_stderr\": 0.020135902797298395\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n \"acc_stderr\": 0.025234593447136175,\n \"acc_norm\": 0.17040358744394618,\n \"acc_norm_stderr\": 0.025234593447136175\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02645350805404035,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02645350805404035\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.26181353767560667,\n \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.02185525526342179,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.02185525526342179\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.01440029642922562,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.01440029642922562\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046102,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046102\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.2540192926045016,\n \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2301173402868318,\n \"acc_stderr\": 0.010750183177375557,\n \"acc_norm\": 0.2301173402868318,\n \"acc_norm_stderr\": 0.010750183177375557\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714864,\n \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714864\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2238562091503268,\n \"acc_stderr\": 0.016863008585416613,\n \"acc_norm\": 0.2238562091503268,\n \"acc_norm_stderr\": 0.016863008585416613\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788167,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788167\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.028996909693328916,\n \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.028996909693328916\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.032082844503563655,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.032082844503563655\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4092484911285778,\n \"mc2_stderr\": 0.01565770896423211\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5059194948697711,\n \"acc_stderr\": 0.014051500838485807\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.002389281512077233\n }\n}\n```", "repo_url": "https://huggingface.co/daekeun-ml/phi-2-upscaled-4B-instruct-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-44-33.417264.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-44-33.417264.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-44-33.417264.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-44-33.417264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-44-33.417264.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["**/details_harness|winogrande|5_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T22-44-33.417264.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T22_44_33.417264", "path": ["results_2024-02-01T22-44-33.417264.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-44-33.417264.parquet"]}]}]}
2024-02-01T22:46:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of daekeun-ml/phi-2-upscaled-4B-instruct-v0.1 Dataset automatically created during the evaluation run of model daekeun-ml/phi-2-upscaled-4B-instruct-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:44:33.417264 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of daekeun-ml/phi-2-upscaled-4B-instruct-v0.1\n\n\n\nDataset automatically created during the evaluation run of model daekeun-ml/phi-2-upscaled-4B-instruct-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:44:33.417264 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of daekeun-ml/phi-2-upscaled-4B-instruct-v0.1\n\n\n\nDataset automatically created during the evaluation run of model daekeun-ml/phi-2-upscaled-4B-instruct-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:44:33.417264 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ad950fdda3fe4a7fdf9c9badfe732ff555607118
# Dataset Card for Evaluation run of vanillaOVO/supermario_v3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v3](https://huggingface.co/vanillaOVO/supermario_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vanillaOVO__supermario_v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:50:10.610999](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v3/blob/main/results_2024-02-01T22-50-10.610999.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6567248618325937, "acc_stderr": 0.0320087478662427, "acc_norm": 0.6561978903294995, "acc_norm_stderr": 0.032680203303182616, "mc1": 0.5789473684210527, "mc1_stderr": 0.01728393624813648, "mc2": 0.7200658221296736, "mc2_stderr": 0.014746636228320684 }, "harness|arc:challenge|25": { "acc": 0.7098976109215017, "acc_stderr": 0.013261573677520767, "acc_norm": 0.7380546075085325, "acc_norm_stderr": 0.012849054826858107 }, "harness|hellaswag|10": { "acc": 0.7148974307906791, "acc_stderr": 0.0045054061766068515, "acc_norm": 0.8891655048795061, "acc_norm_stderr": 0.0031328549889236583 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337135, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337135 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.02289168798455496, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.02289168798455496 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971128, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971128 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131157, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131157 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621126, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621126 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903343, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903343 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.44692737430167595, "acc_stderr": 0.01662803003964761, "acc_norm": 0.44692737430167595, "acc_norm_stderr": 0.01662803003964761 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7331189710610932, "acc_stderr": 0.025122637608816657, "acc_norm": 0.7331189710610932, "acc_norm_stderr": 0.025122637608816657 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712992, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712992 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083131, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083131 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5789473684210527, "mc1_stderr": 0.01728393624813648, "mc2": 0.7200658221296736, "mc2_stderr": 0.014746636228320684 }, "harness|winogrande|5": { "acc": 0.8547750591949487, "acc_stderr": 0.009902153904760824 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923649 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_vanillaOVO__supermario_v3
[ "region:us" ]
2024-02-01T22:52:31+00:00
{"pretty_name": "Evaluation run of vanillaOVO/supermario_v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v3](https://huggingface.co/vanillaOVO/supermario_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vanillaOVO__supermario_v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:50:10.610999](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v3/blob/main/results_2024-02-01T22-50-10.610999.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567248618325937,\n \"acc_stderr\": 0.0320087478662427,\n \"acc_norm\": 0.6561978903294995,\n \"acc_norm_stderr\": 0.032680203303182616,\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.01728393624813648,\n \"mc2\": 0.7200658221296736,\n \"mc2_stderr\": 0.014746636228320684\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.012849054826858107\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n \"acc_stderr\": 0.0045054061766068515,\n \"acc_norm\": 0.8891655048795061,\n \"acc_norm_stderr\": 0.0031328549889236583\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131157,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131157\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n \"acc_stderr\": 0.01662803003964761,\n \"acc_norm\": 0.44692737430167595,\n \"acc_norm_stderr\": 0.01662803003964761\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.01728393624813648,\n \"mc2\": 0.7200658221296736,\n \"mc2_stderr\": 0.014746636228320684\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760824\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \"acc_stderr\": 0.012714401009923649\n 
}\n}\n```", "repo_url": "https://huggingface.co/vanillaOVO/supermario_v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-50-10.610999.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-50-10.610999.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-50-10.610999.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-50-10.610999.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-50-10.610999.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_50_10.610999", "path": ["**/details_harness|winogrande|5_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-50-10.610999.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_50_10.610999", "path": ["results_2024-02-01T22-50-10.610999.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-50-10.610999.parquet"]}]}]}
2024-02-01T22:52:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vanillaOVO/supermario_v3 Dataset automatically created during the evaluation run of model vanillaOVO/supermario_v3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:50:10.610999 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
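A minimal sketch of the load call referenced in the card text above, using the repository and configuration names given in this record's metadata:

```python
from datasets import load_dataset

# Per-sample details for the winogrande task; the "train" split always points to the latest results
data = load_dataset(
    "open-llm-leaderboard/details_vanillaOVO__supermario_v3",
    "harness_winogrande_5",
    split="train",
)
```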
[ "# Dataset Card for Evaluation run of vanillaOVO/supermario_v3\n\n\n\nDataset automatically created during the evaluation run of model vanillaOVO/supermario_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:50:10.610999(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vanillaOVO/supermario_v3\n\n\n\nDataset automatically created during the evaluation run of model vanillaOVO/supermario_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:50:10.610999(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
52a2cc833307bf68c7b0d72f751a0bd2db0192d7
# Dataset Card for Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [msy127/mnsim-dpo-peftmerged-2-eos](https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:52:39.126509](https://huggingface.co/datasets/open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos/blob/main/results_2024-02-01T22-52-39.126509.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5126345853384686, "acc_stderr": 0.034203989562530804, "acc_norm": 0.5177735746052927, "acc_norm_stderr": 0.03497190870754416, "mc1": 0.31701346389228885, "mc1_stderr": 0.016289203374403392, "mc2": 0.4637339879535235, "mc2_stderr": 0.014647929379084504 }, "harness|arc:challenge|25": { "acc": 0.5426621160409556, "acc_stderr": 0.014558106543924065, "acc_norm": 0.5563139931740614, "acc_norm_stderr": 0.014518421825670445 }, "harness|hellaswag|10": { "acc": 0.5824536944831706, "acc_stderr": 0.004921466591335048, "acc_norm": 0.77823142800239, "acc_norm_stderr": 0.0041458720916152155 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5, "acc_stderr": 0.04068942293855797, "acc_norm": 0.5, "acc_norm_stderr": 0.04068942293855797 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5208333333333334, "acc_stderr": 0.041775789507399935, "acc_norm": 0.5208333333333334, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5086705202312138, "acc_stderr": 0.03811890988940412, "acc_norm": 0.5086705202312138, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617749, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617749 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4, "acc_stderr": 0.03202563076101735, "acc_norm": 0.4, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482758, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30952380952380953, "acc_stderr": 0.023809523809523874, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.023809523809523874 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574925, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574925 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5838709677419355, "acc_stderr": 0.028040981380761533, "acc_norm": 0.5838709677419355, "acc_norm_stderr": 0.028040981380761533 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.458128078817734, "acc_stderr": 0.03505630140785741, "acc_norm": 0.458128078817734, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6242424242424243, "acc_stderr": 0.03781887353205982, "acc_norm": 0.6242424242424243, "acc_norm_stderr": 0.03781887353205982 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7616580310880829, "acc_stderr": 0.03074890536390988, "acc_norm": 0.7616580310880829, "acc_norm_stderr": 0.03074890536390988 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5230769230769231, "acc_stderr": 0.025323990861736236, "acc_norm": 0.5230769230769231, "acc_norm_stderr": 0.025323990861736236 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.03243718055137411, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.03243718055137411 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 
0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.710091743119266, "acc_stderr": 0.019453066609201597, "acc_norm": 0.710091743119266, "acc_norm_stderr": 0.019453066609201597 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4027777777777778, "acc_stderr": 0.033448873829978666, "acc_norm": 0.4027777777777778, "acc_norm_stderr": 0.033448873829978666 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6617647058823529, "acc_stderr": 0.03320574612945431, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.03320574612945431 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.70042194092827, "acc_stderr": 0.02981802474975309, "acc_norm": 0.70042194092827, "acc_norm_stderr": 0.02981802474975309 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6053811659192825, "acc_stderr": 0.03280400504755291, "acc_norm": 0.6053811659192825, "acc_norm_stderr": 0.03280400504755291 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5954198473282443, "acc_stderr": 0.043046937953806645, "acc_norm": 0.5954198473282443, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.04205953933884122, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.04205953933884122 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6481481481481481, "acc_stderr": 0.046166311118017125, "acc_norm": 0.6481481481481481, "acc_norm_stderr": 0.046166311118017125 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5766871165644172, "acc_stderr": 0.03881891213334384, "acc_norm": 0.5766871165644172, "acc_norm_stderr": 0.03881891213334384 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6116504854368932, "acc_stderr": 0.0482572933735639, "acc_norm": 0.6116504854368932, "acc_norm_stderr": 0.0482572933735639 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.028760348956523414, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.028760348956523414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6807151979565773, "acc_stderr": 0.01667126174953871, "acc_norm": 0.6807151979565773, "acc_norm_stderr": 0.01667126174953871 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.026261677607806642, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.026261677607806642 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859936, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859936 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.545751633986928, "acc_stderr": 0.028509807802626595, "acc_norm": 0.545751633986928, "acc_norm_stderr": 0.028509807802626595 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6077170418006431, "acc_stderr": 0.027731258647011998, "acc_norm": 0.6077170418006431, "acc_norm_stderr": 0.027731258647011998 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5802469135802469, "acc_stderr": 0.027460099557005135, "acc_norm": 0.5802469135802469, "acc_norm_stderr": 0.027460099557005135 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, 
"acc_stderr": 0.029275532159704725, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.029275532159704725 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.43089960886571055, "acc_stderr": 0.012647695889547228, "acc_norm": 0.43089960886571055, "acc_norm_stderr": 0.012647695889547228 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5367647058823529, "acc_stderr": 0.03029061918048569, "acc_norm": 0.5367647058823529, "acc_norm_stderr": 0.03029061918048569 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5196078431372549, "acc_stderr": 0.020212274976302957, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.020212274976302957 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425464, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425464 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5102040816326531, "acc_stderr": 0.03200255347893783, "acc_norm": 0.5102040816326531, "acc_norm_stderr": 0.03200255347893783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.032510068164586195, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.032510068164586195 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.045126085985421296, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421296 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.31701346389228885, "mc1_stderr": 0.016289203374403392, "mc2": 0.4637339879535235, "mc2_stderr": 0.014647929379084504 }, "harness|winogrande|5": { "acc": 0.7624309392265194, "acc_stderr": 0.011961298905803159 }, "harness|gsm8k|5": { "acc": 0.16906747536012132, "acc_stderr": 0.010324171445497347 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
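The per-task example above uses one of the "harness_*" configurations; the aggregated metrics described earlier live in the separate "results" configuration. Below is a minimal sketch of how to read them, assuming the "results" config and "latest" split listed in this card's metadata; the exact column layout of the results parquet is not documented in the card, so the snippet only inspects what is available instead of hard-coding field names.

```python
from datasets import load_dataset

# Aggregated run-level results; "latest" always aliases the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos",
    "results",
    split="latest",
)

# The schema is produced by the evaluation harness, so inspect it before relying on names.
print(results.column_names)
print(results[0])
```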
open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos
[ "region:us" ]
2024-02-01T22:54:29+00:00
{"pretty_name": "Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos", "dataset_summary": "Dataset automatically created during the evaluation run of model [msy127/mnsim-dpo-peftmerged-2-eos](https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:52:39.126509](https://huggingface.co/datasets/open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos/blob/main/results_2024-02-01T22-52-39.126509.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5126345853384686,\n \"acc_stderr\": 0.034203989562530804,\n \"acc_norm\": 0.5177735746052927,\n \"acc_norm_stderr\": 0.03497190870754416,\n \"mc1\": 0.31701346389228885,\n \"mc1_stderr\": 0.016289203374403392,\n \"mc2\": 0.4637339879535235,\n \"mc2_stderr\": 0.014647929379084504\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924065,\n \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670445\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5824536944831706,\n \"acc_stderr\": 0.004921466591335048,\n \"acc_norm\": 0.77823142800239,\n \"acc_norm_stderr\": 0.0041458720916152155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 
0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523874,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523874\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n \"acc_stderr\": 0.028040981380761533,\n \"acc_norm\": 0.5838709677419355,\n \"acc_norm_stderr\": 0.028040981380761533\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390988,\n \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390988\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6807151979565773,\n \"acc_stderr\": 0.01667126174953871,\n \"acc_norm\": 0.6807151979565773,\n 
\"acc_norm_stderr\": 0.01667126174953871\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859936,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859936\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626595,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626595\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005135,\n \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n \"acc_stderr\": 0.012647695889547228,\n \"acc_norm\": 0.43089960886571055,\n \"acc_norm_stderr\": 0.012647695889547228\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302957,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302957\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n \"acc_stderr\": 0.032510068164586195,\n \"acc_norm\": 0.6965174129353234,\n \"acc_norm_stderr\": 0.032510068164586195\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n \"mc1_stderr\": 0.016289203374403392,\n \"mc2\": 0.4637339879535235,\n \"mc2_stderr\": 0.014647929379084504\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803159\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16906747536012132,\n \"acc_stderr\": 0.010324171445497347\n }\n}\n```", "repo_url": 
"https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_52_39.126509", "path": ["**/details_harness|winogrande|5_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-52-39.126509.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_52_39.126509", "path": ["results_2024-02-01T22-52-39.126509.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-52-39.126509.parquet"]}]}]}
2024-02-01T22:55:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos Dataset automatically created during the evaluation run of model msy127/mnsim-dpo-peftmerged-2-eos on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:52:39.126509 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
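The card above says "To load the details from a run, you can for instance do the following:" but its code snippet was dropped when the text was flattened. A minimal sketch is shown below, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (so `open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos`) and reusing the `harness_winogrande_5` config that the other cards in this dump point at.

```python
from datasets import load_dataset

# Assumed repo id, following the details_<org>__<model> convention used by the
# other evaluation cards in this dump; verify it exists on the Hub before use.
data = load_dataset(
    "open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos",
    "harness_winogrande_5",
    split="train",
)
```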
[ "# Dataset Card for Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos\n\n\n\nDataset automatically created during the evaluation run of model msy127/mnsim-dpo-peftmerged-2-eos on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:52:39.126509(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos\n\n\n\nDataset automatically created during the evaluation run of model msy127/mnsim-dpo-peftmerged-2-eos on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:52:39.126509(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7676bb9b459ca0a23cf460a3dc00e50cdf8feca5
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-base <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-base](https://huggingface.co/h2oai/h2o-danube-1.8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:53:48.852088](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base/blob/main/results_2024-02-01T22-53-48.852088.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26739343781347724, "acc_stderr": 0.031037633875846887, "acc_norm": 0.2690397947420433, "acc_norm_stderr": 0.03180448205346714, "mc1": 0.20195838433292534, "mc1_stderr": 0.014053957441512348, "mc2": 0.3386425348954068, "mc2_stderr": 0.01334349743426728 }, "harness|arc:challenge|25": { "acc": 0.35494880546075086, "acc_stderr": 0.013983036904094094, "acc_norm": 0.39419795221843, "acc_norm_stderr": 0.014280522667467325 }, "harness|hellaswag|10": { "acc": 0.5134435371439953, "acc_stderr": 0.004987977492042154, "acc_norm": 0.6957777335192192, "acc_norm_stderr": 0.004591369853276529 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2814814814814815, "acc_stderr": 0.03885004245800255, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.03885004245800255 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3092105263157895, "acc_stderr": 0.03761070869867479, "acc_norm": 0.3092105263157895, "acc_norm_stderr": 0.03761070869867479 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.27167630057803466, "acc_stderr": 0.03391750322321659, "acc_norm": 0.27167630057803466, "acc_norm_stderr": 0.03391750322321659 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617747, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617747 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.19148936170212766, "acc_stderr": 0.025722149992637795, "acc_norm": 0.19148936170212766, "acc_norm_stderr": 0.025722149992637795 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436695, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436695 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.296551724137931, "acc_stderr": 0.03806142687309994, "acc_norm": 0.296551724137931, "acc_norm_stderr": 0.03806142687309994 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2619047619047619, "acc_stderr": 0.02264421261552521, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.02264421261552521 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.19047619047619047, "acc_stderr": 0.03512207412302054, "acc_norm": 0.19047619047619047, "acc_norm_stderr": 0.03512207412302054 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.267741935483871, "acc_stderr": 0.025189006660212388, "acc_norm": 0.267741935483871, "acc_norm_stderr": 0.025189006660212388 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3103448275862069, "acc_stderr": 0.03255086769970103, "acc_norm": 0.3103448275862069, "acc_norm_stderr": 0.03255086769970103 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.28484848484848485, "acc_stderr": 0.035243908445117836, "acc_norm": 0.28484848484848485, "acc_norm_stderr": 0.035243908445117836 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.23737373737373738, "acc_stderr": 0.03031371053819889, "acc_norm": 0.23737373737373738, "acc_norm_stderr": 0.03031371053819889 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.02869787397186069, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.02869787397186069 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23076923076923078, "acc_stderr": 0.021362027725222717, "acc_norm": 0.23076923076923078, "acc_norm_stderr": 0.021362027725222717 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.02684205787383371, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.02684205787383371 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.18067226890756302, "acc_stderr": 0.024991964966600753, "acc_norm": 0.18067226890756302, "acc_norm_stderr": 0.024991964966600753 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 
0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21834862385321102, "acc_stderr": 0.017712600528722738, "acc_norm": 0.21834862385321102, "acc_norm_stderr": 0.017712600528722738 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.36574074074074076, "acc_stderr": 0.03284738857647205, "acc_norm": 0.36574074074074076, "acc_norm_stderr": 0.03284738857647205 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25980392156862747, "acc_stderr": 0.030778554678693264, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.030778554678693264 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25316455696202533, "acc_stderr": 0.028304657943035303, "acc_norm": 0.25316455696202533, "acc_norm_stderr": 0.028304657943035303 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2242152466367713, "acc_stderr": 0.027991534258519527, "acc_norm": 0.2242152466367713, "acc_norm_stderr": 0.027991534258519527 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22137404580152673, "acc_stderr": 0.0364129708131373, "acc_norm": 0.22137404580152673, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04065578140908705, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.04077494709252626, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3006134969325153, "acc_stderr": 0.03602511318806771, "acc_norm": 0.3006134969325153, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.040073418097558065, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.040073418097558065 }, "harness|hendrycksTest-management|5": { "acc": 0.1650485436893204, "acc_stderr": 0.036756688322331886, "acc_norm": 0.1650485436893204, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2777777777777778, "acc_stderr": 0.029343114798094448, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.029343114798094448 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2515964240102171, "acc_stderr": 0.015517322365529614, "acc_norm": 0.2515964240102171, "acc_norm_stderr": 0.015517322365529614 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2658959537572254, "acc_stderr": 0.023786203255508283, "acc_norm": 0.2658959537572254, "acc_norm_stderr": 0.023786203255508283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24836601307189543, "acc_stderr": 0.02473998135511359, "acc_norm": 0.24836601307189543, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3054662379421222, "acc_stderr": 0.02616058445014049, "acc_norm": 0.3054662379421222, "acc_norm_stderr": 0.02616058445014049 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.28703703703703703, "acc_stderr": 0.02517104191530968, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.02517104191530968 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.2801418439716312, "acc_stderr": 0.026789172351140242, "acc_norm": 0.2801418439716312, "acc_norm_stderr": 0.026789172351140242 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23663624511082137, "acc_stderr": 0.010855137351572728, "acc_norm": 0.23663624511082137, "acc_norm_stderr": 0.010855137351572728 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.39338235294117646, "acc_stderr": 0.02967428828131118, "acc_norm": 0.39338235294117646, "acc_norm_stderr": 0.02967428828131118 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24836601307189543, "acc_stderr": 0.017479487001364764, "acc_norm": 0.24836601307189543, "acc_norm_stderr": 0.017479487001364764 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.22727272727272727, "acc_stderr": 0.04013964554072775, "acc_norm": 0.22727272727272727, "acc_norm_stderr": 0.04013964554072775 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.37551020408163266, "acc_stderr": 0.031001209039894843, "acc_norm": 0.37551020408163266, "acc_norm_stderr": 0.031001209039894843 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.030360490154014652, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.030360490154014652 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.2891566265060241, "acc_stderr": 0.03529486801511115, "acc_norm": 0.2891566265060241, "acc_norm_stderr": 0.03529486801511115 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.28654970760233917, "acc_stderr": 0.03467826685703826, "acc_norm": 0.28654970760233917, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.20195838433292534, "mc1_stderr": 0.014053957441512348, "mc2": 0.3386425348954068, "mc2_stderr": 0.01334349743426728 }, "harness|winogrande|5": { "acc": 0.6448303078137332, "acc_stderr": 0.013450047479569254 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.003282055917136914 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
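The card explains that every run is stored both as a timestamp-named split and behind the rolling "latest" split. As a usage note, here is a sketch of loading one specific run rather than the latest one, using the split and config names that appear in this card's own metadata (`2024_02_01T22_53_48.852088`, `harness_gsm8k_5`):

```python
from datasets import load_dataset

# Pick one specific evaluation run by its timestamp-named split instead of
# the rolling "latest" split; both names come from the configs listed below.
data = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base",
    "harness_gsm8k_5",
    split="2024_02_01T22_53_48.852088",
)
print(data)
```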
open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base
[ "region:us" ]
2024-02-01T22:56:12+00:00
{"pretty_name": "Evaluation run of h2oai/h2o-danube-1.8b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-base](https://huggingface.co/h2oai/h2o-danube-1.8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:53:48.852088](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base/blob/main/results_2024-02-01T22-53-48.852088.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26739343781347724,\n \"acc_stderr\": 0.031037633875846887,\n \"acc_norm\": 0.2690397947420433,\n \"acc_norm_stderr\": 0.03180448205346714,\n \"mc1\": 0.20195838433292534,\n \"mc1_stderr\": 0.014053957441512348,\n \"mc2\": 0.3386425348954068,\n \"mc2_stderr\": 0.01334349743426728\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094094,\n \"acc_norm\": 0.39419795221843,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5134435371439953,\n \"acc_stderr\": 0.004987977492042154,\n \"acc_norm\": 0.6957777335192192,\n \"acc_norm_stderr\": 0.004591369853276529\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.03885004245800255,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.03885004245800255\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.19148936170212766,\n \"acc_stderr\": 0.025722149992637795,\n \"acc_norm\": 0.19148936170212766,\n \"acc_norm_stderr\": 0.025722149992637795\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.02264421261552521,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.02264421261552521\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.025189006660212388,\n \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212388\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.03031371053819889,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.03031371053819889\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.02869787397186069,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.02869787397186069\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222717,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222717\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.18067226890756302,\n \"acc_stderr\": 0.024991964966600753,\n \"acc_norm\": 0.18067226890756302,\n \"acc_norm_stderr\": 0.024991964966600753\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722738,\n \"acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722738\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647205,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647205\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2242152466367713,\n \"acc_stderr\": 0.027991534258519527,\n \"acc_norm\": 0.2242152466367713,\n \"acc_norm_stderr\": 0.027991534258519527\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.040073418097558065,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.040073418097558065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094448,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094448\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529614,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 0.015517322365529614\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n \"acc_stderr\": 0.02616058445014049,\n \"acc_norm\": 0.3054662379421222,\n \"acc_norm_stderr\": 0.02616058445014049\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23663624511082137,\n \"acc_stderr\": 0.010855137351572728,\n \"acc_norm\": 0.23663624511082137,\n \"acc_norm_stderr\": 0.010855137351572728\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.031001209039894843,\n \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.031001209039894843\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20195838433292534,\n \"mc1_stderr\": 0.014053957441512348,\n \"mc2\": 0.3386425348954068,\n \"mc2_stderr\": 0.01334349743426728\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6448303078137332,\n \"acc_stderr\": 0.013450047479569254\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.003282055917136914\n }\n}\n```", "repo_url": "https://huggingface.co/h2oai/h2o-danube-1.8b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-53-48.852088.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-53-48.852088.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-53-48.852088.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-53-48.852088.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-53-48.852088.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["**/details_harness|winogrande|5_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T22-53-48.852088.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T22_53_48.852088", "path": ["results_2024-02-01T22-53-48.852088.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-53-48.852088.parquet"]}]}]}
2024-02-01T22:56:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-base Dataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-base on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:53:48.852088 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
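The sentence "you can for instance do the following:" in the flattened summary above originally preceded a code block that is not present in this text. A minimal sketch of the intended call, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this dump:

```python
# Minimal sketch: load one per-task configuration of the evaluation details.
# The repository id is assumed from the naming pattern of the other cards in
# this collection; it is not stated explicitly in the flattened text above.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base",  # assumed repo id
    "harness_winogrande_5",   # one of the 63 per-task configurations
    split="train",            # "train" always points to the latest results
)
print(data)
```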
[ "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-base\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:53:48.852088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-base\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:53:48.852088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fee36bddc876aa223245d67b42b941b0f8d79a8c
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-sft <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-sft](https://huggingface.co/h2oai/h2o-danube-1.8b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:54:49.142615](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft/blob/main/results_2024-02-01T22-54-49.142615.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3428374590678907, "acc_stderr": 0.033339599143861524, "acc_norm": 0.34426324885865267, "acc_norm_stderr": 0.03407406752430132, "mc1": 0.2533659730722154, "mc1_stderr": 0.015225899340826854, "mc2": 0.4028619731190418, "mc2_stderr": 0.01428278746898766 }, "harness|arc:challenge|25": { "acc": 0.37372013651877134, "acc_stderr": 0.014137708601759098, "acc_norm": 0.40187713310580203, "acc_norm_stderr": 0.01432726861457827 }, "harness|hellaswag|10": { "acc": 0.49790878311093406, "acc_stderr": 0.004989737768749943, "acc_norm": 0.6733718382792272, "acc_norm_stderr": 0.004680215003395913 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.042667634040995814, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.042667634040995814 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2631578947368421, "acc_stderr": 0.03583496176361061, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.03583496176361061 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3886792452830189, "acc_stderr": 0.030000485448675986, "acc_norm": 0.3886792452830189, "acc_norm_stderr": 0.030000485448675986 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2847222222222222, "acc_stderr": 0.03773809990686934, "acc_norm": 0.2847222222222222, "acc_norm_stderr": 0.03773809990686934 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909282, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.32947976878612717, "acc_stderr": 0.03583901754736411, "acc_norm": 0.32947976878612717, "acc_norm_stderr": 0.03583901754736411 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03708284662416545, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03708284662416545 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.24680851063829787, "acc_stderr": 0.028185441301234092, "acc_norm": 0.24680851063829787, "acc_norm_stderr": 0.028185441301234092 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.040969851398436716, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.040969851398436716 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4206896551724138, "acc_stderr": 0.0411391498118926, "acc_norm": 0.4206896551724138, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643895, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643895 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23809523809523808, "acc_stderr": 0.038095238095238106, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.038095238095238106 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4, "acc_stderr": 0.027869320571664635, "acc_norm": 0.4, "acc_norm_stderr": 0.027869320571664635 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.24630541871921183, "acc_stderr": 0.030315099285617732, "acc_norm": 0.24630541871921183, "acc_norm_stderr": 0.030315099285617732 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.44242424242424244, "acc_stderr": 0.038783721137112745, "acc_norm": 0.44242424242424244, "acc_norm_stderr": 0.038783721137112745 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.4444444444444444, "acc_stderr": 0.035402943770953675, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.035402943770953675 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.41450777202072536, "acc_stderr": 0.03555300319557673, "acc_norm": 0.41450777202072536, "acc_norm_stderr": 0.03555300319557673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.30256410256410254, "acc_stderr": 0.023290888053772735, "acc_norm": 0.30256410256410254, "acc_norm_stderr": 0.023290888053772735 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.02730914058823018, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.02730914058823018 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.029213549414372153, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.029213549414372153 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, 
"acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.344954128440367, "acc_stderr": 0.020380605405066962, "acc_norm": 0.344954128440367, "acc_norm_stderr": 0.020380605405066962 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.30092592592592593, "acc_stderr": 0.031280390843298825, "acc_norm": 0.30092592592592593, "acc_norm_stderr": 0.031280390843298825 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03308611113236436, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03308611113236436 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.4430379746835443, "acc_stderr": 0.03233532777533484, "acc_norm": 0.4430379746835443, "acc_norm_stderr": 0.03233532777533484 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3811659192825112, "acc_stderr": 0.03259625118416828, "acc_norm": 0.3811659192825112, "acc_norm_stderr": 0.03259625118416828 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.366412213740458, "acc_stderr": 0.04225875451969638, "acc_norm": 0.366412213740458, "acc_norm_stderr": 0.04225875451969638 }, "harness|hendrycksTest-international_law|5": { "acc": 0.47107438016528924, "acc_stderr": 0.04556710331269498, "acc_norm": 0.47107438016528924, "acc_norm_stderr": 0.04556710331269498 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04803752235190193, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04803752235190193 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3067484662576687, "acc_stderr": 0.036230899157241474, "acc_norm": 0.3067484662576687, "acc_norm_stderr": 0.036230899157241474 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.24107142857142858, "acc_stderr": 0.04059867246952687, "acc_norm": 0.24107142857142858, "acc_norm_stderr": 0.04059867246952687 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258975, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258975 }, "harness|hendrycksTest-marketing|5": { "acc": 0.3717948717948718, "acc_stderr": 0.031660988918880785, "acc_norm": 0.3717948717948718, "acc_norm_stderr": 0.031660988918880785 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4648786717752235, "acc_stderr": 0.01783579880629064, "acc_norm": 0.4648786717752235, "acc_norm_stderr": 0.01783579880629064 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2976878612716763, "acc_stderr": 0.024617055388677003, "acc_norm": 0.2976878612716763, "acc_norm_stderr": 0.024617055388677003 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.01435591196476786, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.01435591196476786 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3790849673202614, "acc_stderr": 0.027780141207023344, "acc_norm": 0.3790849673202614, "acc_norm_stderr": 0.027780141207023344 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.4180064308681672, "acc_stderr": 0.028013651891995072, "acc_norm": 0.4180064308681672, "acc_norm_stderr": 0.028013651891995072 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.3549382716049383, "acc_stderr": 0.026624152478845853, "acc_norm": 0.3549382716049383, "acc_norm_stderr": 0.026624152478845853 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.2765957446808511, "acc_stderr": 0.026684564340460997, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.026684564340460997 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2816166883963494, "acc_stderr": 0.011487783272786696, "acc_norm": 0.2816166883963494, "acc_norm_stderr": 0.011487783272786696 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.35294117647058826, "acc_stderr": 0.029029422815681397, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.029029422815681397 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3333333333333333, "acc_stderr": 0.019070985589687495, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.38181818181818183, "acc_stderr": 0.04653429807913508, "acc_norm": 0.38181818181818183, "acc_norm_stderr": 0.04653429807913508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3224489795918367, "acc_stderr": 0.029923100563683906, "acc_norm": 0.3224489795918367, "acc_norm_stderr": 0.029923100563683906 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3034825870646766, "acc_stderr": 0.03251006816458618, "acc_norm": 0.3034825870646766, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-virology|5": { "acc": 0.3313253012048193, "acc_stderr": 0.036643147772880864, "acc_norm": 0.3313253012048193, "acc_norm_stderr": 0.036643147772880864 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.4269005847953216, "acc_stderr": 0.03793620616529917, "acc_norm": 0.4269005847953216, "acc_norm_stderr": 0.03793620616529917 }, "harness|truthfulqa:mc|0": { "mc1": 0.2533659730722154, "mc1_stderr": 0.015225899340826854, "mc2": 0.4028619731190418, "mc2_stderr": 0.01428278746898766 }, "harness|winogrande|5": { "acc": 0.654301499605367, "acc_stderr": 0.01336659695193438 }, "harness|gsm8k|5": { "acc": 0.1508718726307809, "acc_stderr": 0.009859004137305689 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
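Because the card above describes an aggregated "results" configuration and a "latest" split within every configuration, here is a short usage sketch for pulling the aggregated metrics of this run (the repository id is the one used in the card's own loading example; the field inspection at the end is illustrative):

```python
# Load the aggregated metrics for the most recent evaluation run.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft",
    "results",       # configuration holding the aggregated results
    split="latest",  # "latest" always points to the most recent run
)
# One row per stored run; print the available fields and the first record.
print(results.column_names)
print(results[0])
```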
open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft
[ "region:us" ]
2024-02-01T22:57:14+00:00
{"pretty_name": "Evaluation run of h2oai/h2o-danube-1.8b-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-sft](https://huggingface.co/h2oai/h2o-danube-1.8b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:54:49.142615](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft/blob/main/results_2024-02-01T22-54-49.142615.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3428374590678907,\n \"acc_stderr\": 0.033339599143861524,\n \"acc_norm\": 0.34426324885865267,\n \"acc_norm_stderr\": 0.03407406752430132,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826854,\n \"mc2\": 0.4028619731190418,\n \"mc2_stderr\": 0.01428278746898766\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.37372013651877134,\n \"acc_stderr\": 0.014137708601759098,\n \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.01432726861457827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49790878311093406,\n \"acc_stderr\": 0.004989737768749943,\n \"acc_norm\": 0.6733718382792272,\n \"acc_norm_stderr\": 0.004680215003395913\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361061,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361061\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.2847222222222222,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 
0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416545,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416545\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234092,\n \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234092\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.027869320571664635,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.027869320571664635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.038783721137112745,\n \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.038783721137112745\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557673,\n \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.03555300319557673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772735,\n \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772735\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372153,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.344954128440367,\n \"acc_stderr\": 0.020380605405066962,\n \"acc_norm\": 0.344954128440367,\n \"acc_norm_stderr\": 0.020380605405066962\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236436,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236436\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.031660988918880785,\n \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.031660988918880785\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4648786717752235,\n \"acc_stderr\": 0.01783579880629064,\n 
\"acc_norm\": 0.4648786717752235,\n \"acc_norm_stderr\": 0.01783579880629064\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3790849673202614,\n \"acc_stderr\": 0.027780141207023344,\n \"acc_norm\": 0.3790849673202614,\n \"acc_norm_stderr\": 0.027780141207023344\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3549382716049383,\n \"acc_stderr\": 0.026624152478845853,\n \"acc_norm\": 0.3549382716049383,\n \"acc_norm_stderr\": 0.026624152478845853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460997,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460997\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2816166883963494,\n \"acc_stderr\": 0.011487783272786696,\n \"acc_norm\": 0.2816166883963494,\n \"acc_norm_stderr\": 0.011487783272786696\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.029029422815681397,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.029029422815681397\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.3034825870646766,\n \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826854,\n \"mc2\": 0.4028619731190418,\n \"mc2_stderr\": 0.01428278746898766\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.01336659695193438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1508718726307809,\n \"acc_stderr\": 0.009859004137305689\n }\n}\n```", "repo_url": 
"https://huggingface.co/h2oai/h2o-danube-1.8b-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_54_49.142615", "path": ["**/details_harness|winogrande|5_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-54-49.142615.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T22_54_49.142615", "path": ["results_2024-02-01T22-54-49.142615.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T22-54-49.142615.parquet"]}]}]}
2024-02-01T22:57:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-sft Dataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-sft on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:54:49.142615(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-sft\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:54:49.142615(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-sft\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:54:49.142615(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
159ff0fc3ecd00e0127821a8c3ddd886ec077c0d
# Dataset Card for Evaluation run of vanillaOVO/supermario_v4 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v4](https://huggingface.co/vanillaOVO/supermario_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vanillaOVO__supermario_v4", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T22:55:06.227389](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v4/blob/main/results_2024-02-01T22-55-06.227389.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6599989129866183, "acc_stderr": 0.03192841805798971, "acc_norm": 0.6593861923643444, "acc_norm_stderr": 0.03259944262143704, "mc1": 0.576499388004896, "mc1_stderr": 0.017297421448534744, "mc2": 0.7206547057471042, "mc2_stderr": 0.014737356055250207 }, "harness|arc:challenge|25": { "acc": 0.712457337883959, "acc_stderr": 0.013226719056266129, "acc_norm": 0.734641638225256, "acc_norm_stderr": 0.012902554762313957 }, "harness|hellaswag|10": { "acc": 0.7123083051185023, "acc_stderr": 0.004517614647703243, "acc_norm": 0.8876717785301733, "acc_norm_stderr": 0.003151244960241657 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5158730158730159, "acc_stderr": 0.044698818540726076, "acc_norm": 0.5158730158730159, "acc_norm_stderr": 0.044698818540726076 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188716, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188716 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.02889774874113115, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.02889774874113115 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297794, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297794 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 
0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371805, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371805 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43798882681564244, "acc_stderr": 0.016593394227564843, "acc_norm": 0.43798882681564244, "acc_norm_stderr": 0.016593394227564843 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242557, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242557 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": 
{ "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47522816166883963, "acc_stderr": 0.012754553719781753, "acc_norm": 0.47522816166883963, "acc_norm_stderr": 0.012754553719781753 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462927, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462927 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644286, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644286 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233268, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233268 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.576499388004896, "mc1_stderr": 0.017297421448534744, "mc2": 0.7206547057471042, "mc2_stderr": 0.014737356055250207 }, "harness|winogrande|5": { "acc": 0.8524072612470402, "acc_stderr": 0.009968715765479646 }, "harness|gsm8k|5": { "acc": 0.7012888551933283, "acc_stderr": 0.012607137125693633 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_vanillaOVO__supermario_v4
[ "region:us" ]
2024-02-01T22:57:24+00:00
{"pretty_name": "Evaluation run of vanillaOVO/supermario_v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v4](https://huggingface.co/vanillaOVO/supermario_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vanillaOVO__supermario_v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T22:55:06.227389](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v4/blob/main/results_2024-02-01T22-55-06.227389.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6599989129866183,\n \"acc_stderr\": 0.03192841805798971,\n \"acc_norm\": 0.6593861923643444,\n \"acc_norm_stderr\": 0.03259944262143704,\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.017297421448534744,\n \"mc2\": 0.7206547057471042,\n \"mc2_stderr\": 0.014737356055250207\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313957\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7123083051185023,\n \"acc_stderr\": 0.004517614647703243,\n \"acc_norm\": 0.8876717785301733,\n \"acc_norm_stderr\": 0.003151244960241657\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371805,\n 
\"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371805\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.017297421448534744,\n \"mc2\": 0.7206547057471042,\n \"mc2_stderr\": 0.014737356055250207\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479646\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \"acc_stderr\": 0.012607137125693633\n }\n}\n```", "repo_url": "https://huggingface.co/vanillaOVO/supermario_v4", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["**/details_harness|winogrande|5_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T22-55-06.227389.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T22_55_06.227389", "path": ["results_2024-02-01T22-55-06.227389.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T22-55-06.227389.parquet"]}]}]}
2024-02-01T22:57:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vanillaOVO/supermario_v4 Dataset automatically created during the evaluation run of model vanillaOVO/supermario_v4 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T22:55:06.227389 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
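The processed card text above refers to a load call whose code fence was dropped during flattening; below is a minimal sketch of what it would look like. The dataset id `open-llm-leaderboard/details_vanillaOVO__supermario_v4` is an assumption inferred from the naming pattern of the other evaluation-detail datasets in this dump, and `harness_winogrande_5` is one of the configs listed in the metadata above.

```python
from datasets import load_dataset

# Sketch only: the dataset id below is inferred from the leaderboard's
# details_<org>__<model> naming convention and may need adjusting.
data = load_dataset(
    "open-llm-leaderboard/details_vanillaOVO__supermario_v4",
    "harness_winogrande_5",  # one of the 63 per-task configs listed in the metadata
    split="train",           # "train" always points to the latest results
)
print(data)
```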
[ "# Dataset Card for Evaluation run of vanillaOVO/supermario_v4\n\n\n\nDataset automatically created during the evaluation run of model vanillaOVO/supermario_v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:55:06.227389(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vanillaOVO/supermario_v4\n\n\n\nDataset automatically created during the evaluation run of model vanillaOVO/supermario_v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T22:55:06.227389(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c3463ada6ad1b87c2a77ace383a858571a16949e
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-chat <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-chat](https://huggingface.co/h2oai/h2o-danube-1.8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:01:46.561658](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat/blob/main/results_2024-02-01T23-01-46.561658.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3403400293103947, "acc_stderr": 0.03328323264170753, "acc_norm": 0.3412403351985156, "acc_norm_stderr": 0.03400489086266728, "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834557, "mc2": 0.41637248299191104, "mc2_stderr": 0.01467700936265024 }, "harness|arc:challenge|25": { "acc": 0.3856655290102389, "acc_stderr": 0.01422425097325718, "acc_norm": 0.4112627986348123, "acc_norm_stderr": 0.014379441068522082 }, "harness|hellaswag|10": { "acc": 0.5066719776936865, "acc_stderr": 0.004989337148572074, "acc_norm": 0.6806413065126469, "acc_norm_stderr": 0.004652753439460153 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.42962962962962964, "acc_stderr": 0.04276349494376599, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3157894736842105, "acc_stderr": 0.0378272898086547, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3471698113207547, "acc_stderr": 0.029300101705549652, "acc_norm": 0.3471698113207547, "acc_norm_stderr": 0.029300101705549652 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.31213872832369943, "acc_stderr": 0.035331333893236574, "acc_norm": 0.31213872832369943, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.12745098039215685, "acc_stderr": 0.033182249219420756, "acc_norm": 0.12745098039215685, "acc_norm_stderr": 0.033182249219420756 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2553191489361702, "acc_stderr": 0.02850485647051419, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.02850485647051419 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220554, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220554 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.32413793103448274, "acc_stderr": 0.03900432069185553, "acc_norm": 0.32413793103448274, "acc_norm_stderr": 0.03900432069185553 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.023577604791655805, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.023577604791655805 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235173, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3741935483870968, "acc_stderr": 0.027528904299845787, "acc_norm": 0.3741935483870968, "acc_norm_stderr": 0.027528904299845787 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.23645320197044334, "acc_stderr": 0.029896114291733552, "acc_norm": 0.23645320197044334, "acc_norm_stderr": 0.029896114291733552 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.41818181818181815, "acc_stderr": 0.03851716319398394, "acc_norm": 0.41818181818181815, "acc_norm_stderr": 0.03851716319398394 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.46464646464646464, "acc_stderr": 0.035534363688280626, "acc_norm": 0.46464646464646464, "acc_norm_stderr": 0.035534363688280626 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.37823834196891193, "acc_stderr": 0.03499807276193338, "acc_norm": 0.37823834196891193, "acc_norm_stderr": 0.03499807276193338 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.02242127361292371, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.02242127361292371 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.026593939101844058, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.026593939101844058 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.029213549414372153, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.029213549414372153 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, 
"acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3431192660550459, "acc_stderr": 0.02035477773608604, "acc_norm": 0.3431192660550459, "acc_norm_stderr": 0.02035477773608604 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.30092592592592593, "acc_stderr": 0.031280390843298825, "acc_norm": 0.30092592592592593, "acc_norm_stderr": 0.031280390843298825 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4068627450980392, "acc_stderr": 0.03447891136353382, "acc_norm": 0.4068627450980392, "acc_norm_stderr": 0.03447891136353382 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.46835443037974683, "acc_stderr": 0.03248197400511075, "acc_norm": 0.46835443037974683, "acc_norm_stderr": 0.03248197400511075 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3811659192825112, "acc_stderr": 0.032596251184168284, "acc_norm": 0.3811659192825112, "acc_norm_stderr": 0.032596251184168284 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.366412213740458, "acc_stderr": 0.042258754519696386, "acc_norm": 0.366412213740458, "acc_norm_stderr": 0.042258754519696386 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5206611570247934, "acc_stderr": 0.04560456086387235, "acc_norm": 0.5206611570247934, "acc_norm_stderr": 0.04560456086387235 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3888888888888889, "acc_stderr": 0.047128212574267705, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.047128212574267705 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.31901840490797545, "acc_stderr": 0.03661997551073836, "acc_norm": 0.31901840490797545, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755806, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755806 }, "harness|hendrycksTest-management|5": { "acc": 0.4077669902912621, "acc_stderr": 0.048657775704107696, "acc_norm": 0.4077669902912621, "acc_norm_stderr": 0.048657775704107696 }, "harness|hendrycksTest-marketing|5": { "acc": 0.37606837606837606, "acc_stderr": 0.03173393632969482, "acc_norm": 0.37606837606837606, "acc_norm_stderr": 0.03173393632969482 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4648786717752235, "acc_stderr": 0.01783579880629064, "acc_norm": 0.4648786717752235, "acc_norm_stderr": 0.01783579880629064 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.30346820809248554, "acc_stderr": 0.02475241196091722, "acc_norm": 0.30346820809248554, "acc_norm_stderr": 0.02475241196091722 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2446927374301676, "acc_stderr": 0.014378169884098435, "acc_norm": 0.2446927374301676, "acc_norm_stderr": 0.014378169884098435 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3758169934640523, "acc_stderr": 0.02773283435336394, "acc_norm": 0.3758169934640523, "acc_norm_stderr": 0.02773283435336394 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.40192926045016075, "acc_stderr": 0.027846476005930477, "acc_norm": 0.40192926045016075, "acc_norm_stderr": 0.027846476005930477 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.3333333333333333, "acc_stderr": 0.026229649178821177, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.026229649178821177 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.28368794326241137, "acc_stderr": 0.02689170942834396, "acc_norm": 0.28368794326241137, "acc_norm_stderr": 0.02689170942834396 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.27640156453715775, "acc_stderr": 0.011422153194553567, "acc_norm": 0.27640156453715775, "acc_norm_stderr": 0.011422153194553567 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.33088235294117646, "acc_stderr": 0.028582709753898452, "acc_norm": 0.33088235294117646, "acc_norm_stderr": 0.028582709753898452 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3333333333333333, "acc_stderr": 0.019070985589687495, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.33636363636363636, "acc_stderr": 0.04525393596302506, "acc_norm": 0.33636363636363636, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3142857142857143, "acc_stderr": 0.029719329422417482, "acc_norm": 0.3142857142857143, "acc_norm_stderr": 0.029719329422417482 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03333333333333334, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03333333333333334 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-virology|5": { "acc": 0.29518072289156627, "acc_stderr": 0.0355092018568963, "acc_norm": 0.29518072289156627, "acc_norm_stderr": 0.0355092018568963 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.4093567251461988, "acc_stderr": 0.03771283107626544, "acc_norm": 0.4093567251461988, "acc_norm_stderr": 0.03771283107626544 }, "harness|truthfulqa:mc|0": { "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834557, "mc2": 0.41637248299191104, "mc2_stderr": 0.01467700936265024 }, "harness|winogrande|5": { "acc": 0.6535122336227308, "acc_stderr": 0.013373773411685646 }, "harness|gsm8k|5": { "acc": 0.17361637604245642, "acc_stderr": 0.010433463221257634 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
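As a companion to the load example in the card above, here is a minimal sketch of pulling the aggregated scores for this run via the "results" configuration and its "latest" split described there; the column layout of the underlying parquet file is not documented in this card, so the sketch simply inspects whatever columns are present.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: the "results" config and the
# "latest" split are described in the card above. Column names are not
# documented here, so we just inspect what the split contains.
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```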
open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat
[ "region:us" ]
2024-02-01T23:04:07+00:00
{"pretty_name": "Evaluation run of h2oai/h2o-danube-1.8b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-chat](https://huggingface.co/h2oai/h2o-danube-1.8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:01:46.561658](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat/blob/main/results_2024-02-01T23-01-46.561658.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3403400293103947,\n \"acc_stderr\": 0.03328323264170753,\n \"acc_norm\": 0.3412403351985156,\n \"acc_norm_stderr\": 0.03400489086266728,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.41637248299191104,\n \"mc2_stderr\": 0.01467700936265024\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3856655290102389,\n \"acc_stderr\": 0.01422425097325718,\n \"acc_norm\": 0.4112627986348123,\n \"acc_norm_stderr\": 0.014379441068522082\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5066719776936865,\n \"acc_stderr\": 0.004989337148572074,\n \"acc_norm\": 0.6806413065126469,\n \"acc_norm_stderr\": 0.004652753439460153\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3471698113207547,\n \"acc_stderr\": 0.029300101705549652,\n \"acc_norm\": 0.3471698113207547,\n \"acc_norm_stderr\": 0.029300101705549652\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 
0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.12745098039215685,\n \"acc_stderr\": 0.033182249219420756,\n \"acc_norm\": 0.12745098039215685,\n \"acc_norm_stderr\": 0.033182249219420756\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02850485647051419,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02850485647051419\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185553,\n \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185553\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655805,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655805\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3741935483870968,\n \"acc_stderr\": 0.027528904299845787,\n \"acc_norm\": 0.3741935483870968,\n \"acc_norm_stderr\": 0.027528904299845787\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398394,\n \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.46464646464646464,\n \"acc_stderr\": 0.035534363688280626,\n \"acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.035534363688280626\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193338,\n \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193338\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02242127361292371,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02242127361292371\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844058,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844058\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372153,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353382,\n \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353382\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.46835443037974683,\n \"acc_stderr\": 0.03248197400511075,\n \"acc_norm\": 0.46835443037974683,\n \"acc_norm_stderr\": 0.03248197400511075\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n \"acc_stderr\": 0.032596251184168284,\n \"acc_norm\": 0.3811659192825112,\n \"acc_norm_stderr\": 0.032596251184168284\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.042258754519696386,\n \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.042258754519696386\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.047128212574267705,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.047128212574267705\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107696,\n \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n \"acc_stderr\": 0.03173393632969482,\n \"acc_norm\": 0.37606837606837606,\n \"acc_norm_stderr\": 0.03173393632969482\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.4648786717752235,\n \"acc_stderr\": 0.01783579880629064,\n \"acc_norm\": 0.4648786717752235,\n \"acc_norm_stderr\": 0.01783579880629064\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091722,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091722\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098435,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098435\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3758169934640523,\n \"acc_stderr\": 0.02773283435336394,\n \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.02773283435336394\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40192926045016075,\n \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.40192926045016075,\n \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026229649178821177,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026229649178821177\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n \"acc_stderr\": 0.011422153194553567,\n \"acc_norm\": 0.27640156453715775,\n \"acc_norm_stderr\": 0.011422153194553567\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.028582709753898452,\n \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.028582709753898452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417482,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417482\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.0355092018568963,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.0355092018568963\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.03771283107626544,\n \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.03771283107626544\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.41637248299191104,\n \"mc2_stderr\": 0.01467700936265024\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685646\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17361637604245642,\n \"acc_stderr\": 
0.010433463221257634\n }\n}\n```", "repo_url": "https://huggingface.co/h2oai/h2o-danube-1.8b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-01-46.561658.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-01-46.561658.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-01-46.561658.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-01-46.561658.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-01-46.561658.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_01_46.561658", "path": ["**/details_harness|winogrande|5_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-01-46.561658.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T23_01_46.561658", "path": ["results_2024-02-01T23-01-46.561658.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-01-46.561658.parquet"]}]}]}
2024-02-01T23:04:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-chat Dataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-chat on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:01:46.561658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
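The processed card text above keeps the sentence "To load the details from a run, you can for instance do the following:" but drops the fenced code block that the original README places after it. As a minimal sketch, the call it refers to (repository, config and split names taken from the metadata record for this entry) would look like:

```python
from datasets import load_dataset

# Repository, config and split names come from the metadata record above;
# "harness_winogrande_5" holds the per-example WinoGrande details for this run.
data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat",
	"harness_winogrande_5",
	split="train")
```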
[ "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-chat\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:01:46.561658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-chat\n\n\n\nDataset automatically created during the evaluation run of model h2oai/h2o-danube-1.8b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:01:46.561658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
549fbb555aa9a3c787866ee63b875535b128c0b9
# Dataset Card for Evaluation run of Plaban81/Moe-4x7b-math-reason-code <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Plaban81/Moe-4x7b-math-reason-code](https://huggingface.co/Plaban81/Moe-4x7b-math-reason-code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:02:06.191650](https://huggingface.co/datasets/open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code/blob/main/results_2024-02-01T23-02-06.191650.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.61366513110882, "acc_stderr": 0.033073367558772764, "acc_norm": 0.6160976239521233, "acc_norm_stderr": 0.03373653941415801, "mc1": 0.4039167686658507, "mc1_stderr": 0.017177276822584284, "mc2": 0.5611833547243651, "mc2_stderr": 0.015990413066061377 }, "harness|arc:challenge|25": { "acc": 0.5887372013651877, "acc_stderr": 0.014379441068522084, "acc_norm": 0.6254266211604096, "acc_norm_stderr": 0.014144193471893454 }, "harness|hellaswag|10": { "acc": 0.652459669388568, "acc_stderr": 0.004752158936871871, "acc_norm": 0.8386775542720574, "acc_norm_stderr": 0.0036707636737929633 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.029067220146644826, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.029067220146644826 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 
0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5191489361702127, "acc_stderr": 0.03266204299064678, "acc_norm": 0.5191489361702127, "acc_norm_stderr": 0.03266204299064678 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949098, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949098 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5967741935483871, "acc_stderr": 0.027906150826041143, "acc_norm": 0.5967741935483871, "acc_norm_stderr": 0.027906150826041143 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009181, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009181 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.02614848346915332, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.02614848346915332 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5871794871794872, "acc_stderr": 0.024962683564331796, "acc_norm": 0.5871794871794872, "acc_norm_stderr": 0.024962683564331796 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131143, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131143 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.03149930577784906, "acc_norm": 0.6218487394957983, "acc_norm_stderr": 0.03149930577784906 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8, "acc_stderr": 0.01714985851425095, "acc_norm": 0.8, "acc_norm_stderr": 0.01714985851425095 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.029771775228145635, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145635 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676166, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676166 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.03880848301082393, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.03880848301082393 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650743, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650743 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.035590395316173425, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.02363687331748929, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.02363687331748929 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8109833971902938, "acc_stderr": 0.014000791294406999, "acc_norm": 0.8109833971902938, "acc_norm_stderr": 0.014000791294406999 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.024547617794803828, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.024547617794803828 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.376536312849162, "acc_stderr": 0.016204672385106603, "acc_norm": 0.376536312849162, "acc_norm_stderr": 0.016204672385106603 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6928104575163399, "acc_stderr": 0.026415601914388992, "acc_norm": 0.6928104575163399, "acc_norm_stderr": 0.026415601914388992 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.025839898334877983, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.025839898334877983 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.691358024691358, "acc_stderr": 0.025702640260603742, "acc_norm": 0.691358024691358, "acc_norm_stderr": 0.025702640260603742 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, 
"acc_stderr": 0.029766675075873862, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873862 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.012685906538206242, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.012685906538206242 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.02895975519682487, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.02895975519682487 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.01911721391149515, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.01911721391149515 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.0282638899437846, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.0282638899437846 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5572139303482587, "acc_stderr": 0.03512310964123937, "acc_norm": 0.5572139303482587, "acc_norm_stderr": 0.03512310964123937 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.026640582539133196, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.026640582539133196 }, "harness|truthfulqa:mc|0": { "mc1": 0.4039167686658507, "mc1_stderr": 0.017177276822584284, "mc2": 0.5611833547243651, "mc2_stderr": 0.015990413066061377 }, "harness|winogrande|5": { "acc": 0.760852407261247, "acc_stderr": 0.011988541844843914 }, "harness|gsm8k|5": { "acc": 0.5458680818802123, "acc_stderr": 0.013714410945264549 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
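The aggregated metrics for this run are also stored in a dedicated "results" configuration of the details repository recorded for this entry. Below is a minimal sketch of reading it with the `datasets` library; the repository id and the "results"/"latest" configuration and split names are the ones listed in this entry's metadata, and the exact column layout of the parquet file is not guaranteed here, so inspect it before relying on specific keys.

```python
from datasets import load_dataset

# Aggregated metrics for the run. The repository id and the "results" config
# with its "latest" split are the ones listed in this entry's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code",
    "results",
    split="latest",
)

# One row per evaluation run; check the available columns before indexing.
print(results.column_names)
print(results[0])
```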
open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code
[ "region:us" ]
2024-02-01T23:04:24+00:00
{"pretty_name": "Evaluation run of Plaban81/Moe-4x7b-math-reason-code", "dataset_summary": "Dataset automatically created during the evaluation run of model [Plaban81/Moe-4x7b-math-reason-code](https://huggingface.co/Plaban81/Moe-4x7b-math-reason-code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:02:06.191650](https://huggingface.co/datasets/open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code/blob/main/results_2024-02-01T23-02-06.191650.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.61366513110882,\n \"acc_stderr\": 0.033073367558772764,\n \"acc_norm\": 0.6160976239521233,\n \"acc_norm_stderr\": 0.03373653941415801,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5611833547243651,\n \"mc2_stderr\": 0.015990413066061377\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n \"acc_stderr\": 0.004752158936871871,\n \"acc_norm\": 0.8386775542720574,\n \"acc_norm_stderr\": 0.0036707636737929633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n \"acc_stderr\": 0.027906150826041143,\n \"acc_norm\": 0.5967741935483871,\n \"acc_norm_stderr\": 0.027906150826041143\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n 
\"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5611833547243651,\n \"mc2_stderr\": 0.015990413066061377\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843914\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5458680818802123,\n \"acc_stderr\": 0.013714410945264549\n }\n}\n```", "repo_url": 
"https://huggingface.co/Plaban81/Moe-4x7b-math-reason-code", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-02-06.191650.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-02-06.191650.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-02-06.191650.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-02-06.191650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-02-06.191650.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_02_06.191650", "path": ["**/details_harness|winogrande|5_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-02-06.191650.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T23_02_06.191650", "path": ["results_2024-02-01T23-02-06.191650.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-02-06.191650.parquet"]}]}]}
2024-02-01T23:04:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Plaban81/Moe-4x7b-math-reason-code Dataset automatically created during the evaluation run of model Plaban81/Moe-4x7b-math-reason-code on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:02:06.191650 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
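The summary above points to loading per-task details from this repository; a minimal sketch follows, using the `harness_winogrande_5` configuration and the "latest" split listed in this entry's metadata. Any other `harness_<task>_<n_shot>` configuration from the metadata can be substituted, and the full list can be enumerated programmatically.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Plaban81__Moe-4x7b-math-reason-code"

# Enumerate the per-task configurations (harness_<task>_<n_shot> naming).
print(get_dataset_config_names(repo))

# Load the per-example details for one task; "latest" points at the newest run.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
print(winogrande[0])
```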
[ "# Dataset Card for Evaluation run of Plaban81/Moe-4x7b-math-reason-code\n\n\n\nDataset automatically created during the evaluation run of model Plaban81/Moe-4x7b-math-reason-code on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:02:06.191650(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Plaban81/Moe-4x7b-math-reason-code\n\n\n\nDataset automatically created during the evaluation run of model Plaban81/Moe-4x7b-math-reason-code on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:02:06.191650(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
489233cd86396525fb8530c6baa4557a75c56ce8
# Dataset Card for Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [indischepartij/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:08:05.664341](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T23-08-05.664341.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26201452650295837, "acc_stderr": 0.030950575098959248, "acc_norm": 0.26190159146597486, "acc_norm_stderr": 0.03169834440202644, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731618, "mc2": 0.3758799861882878, "mc2_stderr": 0.014070883279660485 }, "harness|arc:challenge|25": { "acc": 0.3447098976109215, "acc_stderr": 0.01388881628678211, "acc_norm": 0.34897610921501704, "acc_norm_stderr": 0.013928933461382504 }, "harness|hellaswag|10": { "acc": 0.46594303923521213, "acc_stderr": 0.004978192893406287, "acc_norm": 0.6142202748456482, "acc_norm_stderr": 0.004857840934549174 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2222222222222222, "acc_stderr": 0.035914440841969694, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.035914440841969694 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.03110318238312338, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.03110318238312338 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.27169811320754716, "acc_stderr": 0.027377706624670713, "acc_norm": 0.27169811320754716, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.24,
"acc_stderr": 0.04292346959909282, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173043, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173043 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179961, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179961 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.32340425531914896, "acc_stderr": 0.030579442773610334, "acc_norm": 0.32340425531914896, "acc_norm_stderr": 0.030579442773610334 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.0414243971948936, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.0414243971948936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.036196045241242515, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.036196045241242515 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.26129032258064516, "acc_stderr": 0.024993053397764826, "acc_norm": 0.26129032258064516, "acc_norm_stderr": 0.024993053397764826 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.031447125816782405, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.031447125816782405 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24848484848484848, "acc_stderr": 0.03374402644139404, "acc_norm": 0.24848484848484848, "acc_norm_stderr": 0.03374402644139404 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.21717171717171718, "acc_stderr": 0.029376616484945637, "acc_norm": 0.21717171717171718, "acc_norm_stderr": 0.029376616484945637 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22279792746113988, "acc_stderr": 0.03003114797764154, "acc_norm": 0.22279792746113988, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.021444547301560486, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.021444547301560486 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2689075630252101, "acc_stderr": 0.02880139219363128, "acc_norm": 0.2689075630252101, "acc_norm_stderr": 
0.02880139219363128 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473835, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473835 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23669724770642203, "acc_stderr": 0.01822407811729908, "acc_norm": 0.23669724770642203, "acc_norm_stderr": 0.01822407811729908 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35648148148148145, "acc_stderr": 0.032664783315272714, "acc_norm": 0.35648148148148145, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.22549019607843138, "acc_stderr": 0.02933116229425173, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.02933116229425173 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.28270042194092826, "acc_stderr": 0.029312814153955934, "acc_norm": 0.28270042194092826, "acc_norm_stderr": 0.029312814153955934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3721973094170404, "acc_stderr": 0.032443052830087304, "acc_norm": 0.3721973094170404, "acc_norm_stderr": 0.032443052830087304 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.256198347107438, "acc_stderr": 0.03984979653302871, "acc_norm": 0.256198347107438, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3055555555555556, "acc_stderr": 0.044531975073749834, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.044531975073749834 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.2621359223300971, "acc_stderr": 0.04354631077260597, "acc_norm": 0.2621359223300971, "acc_norm_stderr": 0.04354631077260597 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2886334610472541, "acc_stderr": 0.016203792703197804, "acc_norm": 0.2886334610472541, "acc_norm_stderr": 0.016203792703197804 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24804469273743016, "acc_stderr": 0.014444157808261445, "acc_norm": 0.24804469273743016, "acc_norm_stderr": 0.014444157808261445 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24509803921568626, "acc_stderr": 0.024630048979824765, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.024630048979824765 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2733118971061093, "acc_stderr": 0.02531176597542612, "acc_norm": 0.2733118971061093, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, 
"acc_stderr": 0.024569223600460845, "acc_norm": 0.2654320987654321, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24822695035460993, "acc_stderr": 0.0257700156442904, "acc_norm": 0.24822695035460993, "acc_norm_stderr": 0.0257700156442904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23859191655801826, "acc_stderr": 0.0108859297420022, "acc_norm": 0.23859191655801826, "acc_norm_stderr": 0.0108859297420022 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.20220588235294118, "acc_stderr": 0.02439819298665492, "acc_norm": 0.20220588235294118, "acc_norm_stderr": 0.02439819298665492 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25980392156862747, "acc_stderr": 0.01774089950917779, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.01774089950917779 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.34545454545454546, "acc_stderr": 0.04554619617541054, "acc_norm": 0.34545454545454546, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.16326530612244897, "acc_stderr": 0.023661699177098622, "acc_norm": 0.16326530612244897, "acc_norm_stderr": 0.023661699177098622 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.030147775935409224, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.030147775935409224 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-virology|5": { "acc": 0.3132530120481928, "acc_stderr": 0.036108050180310235, "acc_norm": 0.3132530120481928, "acc_norm_stderr": 0.036108050180310235 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21052631578947367, "acc_stderr": 0.03126781714663179, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731618, "mc2": 0.3758799861882878, "mc2_stderr": 0.014070883279660485 }, "harness|winogrande|5": { "acc": 0.6574585635359116, "acc_stderr": 0.013337483579075929 }, "harness|gsm8k|5": { "acc": 0.02577710386656558, "acc_stderr": 0.004365042953621804 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
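As a complement to the loading snippet above, here is a minimal sketch, assuming the standard `datasets` Python API (`get_dataset_config_names` and `load_dataset`), of how one might list the available per-task configurations and pull the aggregated "results" configuration; the repository name is taken from this card, and the "results" configuration with its "latest" split follows the configuration listing in the metadata below.

```python
from datasets import get_dataset_config_names, load_dataset

# Sketch only (not part of the original card): discover the per-task
# configurations exposed by this details repository.
repo = "open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha"
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split always points at the most recent evaluation results.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```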
open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha
[ "region:us" ]
2024-02-01T23:09:55+00:00
{"pretty_name": "Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:08:05.664341](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T23-08-05.664341.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26201452650295837,\n \"acc_stderr\": 0.030950575098959248,\n \"acc_norm\": 0.26190159146597486,\n \"acc_norm_stderr\": 0.03169834440202644,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3447098976109215,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.34897610921501704,\n \"acc_norm_stderr\": 0.013928933461382504\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46594303923521213,\n \"acc_stderr\": 0.004978192893406287,\n \"acc_norm\": 0.6142202748456482,\n \"acc_norm_stderr\": 0.004857840934549174\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312338,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312338\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 
0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.02880139219363128,\n \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.02880139219363128\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 
0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261445,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261445\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824765,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824765\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.023661699177098622,\n \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.023661699177098622\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 
0.013337483579075929\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 0.004365042953621804\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["**/details_harness|winogrande|5_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T23-08-05.664341.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T23_08_05.664341", "path": ["results_2024-02-01T23-08-05.664341.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-08-05.664341.parquet"]}]}]}
2024-02-01T23:10:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha Dataset automatically created during the evaluation run of model indischepartij/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:08:05.664341 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
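The processed card text above says "you can for instance do the following" but its code block was stripped during processing. A minimal sketch of the load call is given below; the repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention used by the other records in this dump, not something this processed text states explicitly.

```python
from datasets import load_dataset

# Assumed repository id, inferred from the "details_<org>__<model>" naming convention;
# the processed card text above does not spell it out.
data = load_dataset(
    "open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha",
    "harness_winogrande_5",  # one of the 63 configurations listed in this record's metadata
    split="train",           # "train" always points at the latest results
)
```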
[ "# Dataset Card for Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:08:05.664341(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/TinyUltra-4x1.1B-Base-Alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:08:05.664341(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6ac08fad041ab154cb54d95f21255ab5c3ca42bf
# Dataset Card for SUBAK.KO ## Table of Contents - [Dataset Card for SUBAK.KO](#dataset-card-for-SUBAK.KO) - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Initial Data Collection and Normalization](#initial-data-collection-and-normalization) - [Who are the source language producers?](#who-are-the-source-language-producers) - [Annotations](#annotations) - [Annotation process](#annotation-process) - [Who are the annotators?](#who-are-the-annotators) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Developed By** Dept. of CSE, SUST, Bangladesh - **Paper:** [Bangladeshi Bangla speech corpus for automatic speech recognition research](https://www.sciencedirect.com/science/article/abs/pii/S0167639321001370) - **Point of Contact:** [Prof. Dr. M. Shahidur Rahman, Dept. of CSE, SUST](mailto:[email protected]) ### Dataset Summary SUBAK.KO (সুবাক্য), a publicly available annotated Bangladeshi standard Bangla speech corpus, is compiled for automatic speech recognition research. This corpus contains 241 hours of high-quality speech data, including 229 hours of read speech data and 12 hours of broadcast speech data. The read speech segment is recorded in a noise-proof studio environment from 33 male and 28 female native Bangladeshi Bangla speakers representing 8 divisions/34 districts of Bangladesh. Furthermore, the read speech segment comprises a total of 1 hour and 30 minutes of recorded speech provided by two second language (L2) speakers. The broadcast speech segment is collected from YouTube. SUBAK.KO has been manually annotated under human supervision to ensure gold-standard labels. The [corresponding paper](https://www.sciencedirect.com/science/article/abs/pii/S0167639321001370) reports detailed information about the development and baseline performance of SUBAK.KO and cross-dataset evaluation in comparison to [LB-ASRTD](https://openslr.org/53/) corpus. SUBAK.KO is developed by the researchers from the **Department of Computer Science and Engineering (CSE)** at **Shahjalal University of Science and Technology (SUST), Bangladesh** with financial support from the Higher Education Quality Enhancement Project (AIF Window 4, CP 3888) for “The Development of Multi-Platform Speech and Language Processing Software for Bangla” of the University Grants Commission (UGC), Bangladesh. 
### Example Usage To load the full SUBAK.KO corpus, use the following code: ```python from datasets import load_dataset dataset = load_dataset("SUST-CSE-Speech/SUBAK.KO") ``` To load a specific split of the SUBAK.KO, define the split and set the streaming mode as True in the following way: ```python from datasets import load_dataset dataset = load_dataset("SUST-CSE-Speech/SUBAK.KO", split="test", streaming=True) ``` More documentation on streaming can be found [from this link.](https://huggingface.co/docs/datasets/stream#split-dataset) Alternatively, you can manually download the zipped SUBAK.KO folder from [this HuggingFace directory.](https://huggingface.co/datasets/ahnafsamin/SUBAK.KO/tree/main/Data) The csv files corresponding to the train, validation and test splits can be found in the same directory. ### Supported Tasks and Leaderboards This dataset is designed for the automatic speech recognition task. The associated paper provides the baseline results on SUBAK.KO corpus. ### Languages Bangladeshi standard Bangla ## Dataset Structure ### Data Instances A typical data point comprises the path to the audio file and its transcription. ``` { 'audio': {'path': '/home/username/subakko/part5/wav5/e4/TNM22_MESBA_page_257-258_5_5_Labeled_by_Tomal-20.wav', 'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32), 'sampling_rate': 16000}, 'transcript': 'তারপর চার মাস তিনি ছিলেন কেন্দ্রীয় গোয়েন্দা সংস্থার তত্বাবধানে এক নিরাপদ জায়গায়', 'path': '/subakko/part5/wav5/e4/TNM22_MESBA_page_257-258_5_5_Labeled_by_Tomal-20.wav' } ``` ### Data Fields - audio: A dictionary containing the path to the original audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. - transcription: The orthographic transcription - file_path: The relative path to the audio file ### Data Splits SUBAK.KO has been subdivided into three splits for train, validation and test. It is strongly advised to use identical data splits for research purposes to facilitate benchmarking across various models. | | Train | Validation | Test | | ---------------- | ---------|------------|----------| | Utterances | 64491 | 6594 | 6533 | | Duration | 200.3 hrs| 20.5 hrs | 20.3 hrs | ## Additional Information ### Licensing Information [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/deed.en) ### Citation Information Please cite the following paper if you use the corpus. ``` @article{kibria2022bangladeshi, title={Bangladeshi Bangla speech corpus for automatic speech recognition research}, author={Kibria, Shafkat and Samin, Ahnaf Mozib and Kobir, M Humayon and Rahman, M Shahidur and Selim, M Reza and Iqbal, M Zafar}, journal={Speech Communication}, volume={136}, pages={84--97}, year={2022}, publisher={Elsevier} } ``` ### Contributions Thanks to [Ahnaf Mozib Samin](https://huggingface.co/ahnafsamin) for adding this dataset.
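The Data Fields section above recommends querying the sample index before the `"audio"` column so that only one file is decoded and resampled. A minimal sketch of that access pattern, assuming the test split has been prepared locally with the load call shown in the card:

```python
from datasets import load_dataset

# Non-streaming load; this downloads and prepares the test split locally.
dataset = load_dataset("SUST-CSE-Speech/SUBAK.KO", split="test")

# Query the sample index *before* the "audio" column, as advised above, so that
# only this single file is decoded and resampled.
audio = dataset[0]["audio"]
print(audio["sampling_rate"])        # 16000 Hz for SUBAK.KO
print(len(audio["array"]))           # decoded waveform samples
print(dataset[0]["transcription"])   # orthographic transcription
print(dataset[0]["file_path"])       # relative path to the audio file
```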
SUST-CSE-Speech/SUBAK.KO
[ "task_categories:automatic-speech-recognition", "size_categories:10K<n<100K", "language:bn", "license:cc-by-4.0", "speech-recognition", "Bangladeshi Bangla", "Bengali", "speech-corpus", "region:us" ]
2024-02-01T23:18:28+00:00
{"language": ["bn"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["automatic-speech-recognition"], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}, {"name": "file_path", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2345138893.961, "num_examples": 6533}, {"name": "validation", "num_bytes": 2374606148.554, "num_examples": 6594}, {"name": "train", "num_bytes": 23111288170.312, "num_examples": 64491}], "download_size": 31898660522, "dataset_size": 27831033212.827}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "train", "path": "data/train-*"}]}], "tags": ["speech-recognition", "Bangladeshi Bangla", "Bengali", "speech-corpus"]}
2024-02-04T08:42:33+00:00
[]
[ "bn" ]
TAGS #task_categories-automatic-speech-recognition #size_categories-10K<n<100K #language-Bengali #license-cc-by-4.0 #speech-recognition #Bangladeshi Bangla #Bengali #speech-corpus #region-us
Dataset Card for SUBAK.KO ========================= Table of Contents ----------------- * Dataset Card for SUBAK.KO + Table of Contents + Dataset Description - Dataset Summary - Supported Tasks and Leaderboards - Languages + Dataset Structure - Data Instances - Data Fields - Data Splits + Dataset Creation - Curation Rationale - Source Data * Initial Data Collection and Normalization * Who are the source language producers? - Annotations * Annotation process * Who are the annotators? - Personal and Sensitive Information + Considerations for Using the Data - Social Impact of Dataset - Discussion of Biases - Other Known Limitations + Additional Information - Dataset Curators - Licensing Information - Citation Information - Contributions Dataset Description ------------------- * Developed By Dept. of CSE, SUST, Bangladesh * Paper: Bangladeshi Bangla speech corpus for automatic speech recognition research * Point of Contact: Prof. Dr. M. Shahidur Rahman, Dept. of CSE, SUST ### Dataset Summary SUBAK.KO (সুবাক্য), a publicly available annotated Bangladeshi standard Bangla speech corpus, is compiled for automatic speech recognition research. This corpus contains 241 hours of high-quality speech data, including 229 hours of read speech data and 12 hours of broadcast speech data. The read speech segment is recorded in a noise-proof studio environment from 33 male and 28 female native Bangladeshi Bangla speakers representing 8 divisions/34 districts of Bangladesh. Furthermore, the read speech segment comprises a total of 1 hour and 30 minutes of recorded speech provided by two second language (L2) speakers. The broadcast speech segment is collected from YouTube. SUBAK.KO has been manually annotated under human supervision to ensure gold-standard labels. The corresponding paper reports detailed information about the development and baseline performance of SUBAK.KO and cross-dataset evaluation in comparison to LB-ASRTD corpus. SUBAK.KO is developed by the researchers from the Department of Computer Science and Engineering (CSE) at Shahjalal University of Science and Technology (SUST), Bangladesh with financial support from the Higher Education Quality Enhancement Project (AIF Window 4, CP 3888) for “The Development of Multi-Platform Speech and Language Processing Software for Bangla” of the University Grants Commission (UGC), Bangladesh. ### Example Usage To load the full SUBAK.KO corpus, use the following code: To load a specific split of the SUBAK.KO, define the split and set the streaming mode as True in the following way: More documentation on streaming can be found from this link. Alternatively, you can manually download the zipped SUBAK.KO folder from this HuggingFace directory. The csv files corresponding to the train, validation and test splits can be found in the same directory. ### Supported Tasks and Leaderboards This dataset is designed for the automatic speech recognition task. The associated paper provides the baseline results on SUBAK.KO corpus. ### Languages Bangladeshi standard Bangla Dataset Structure ----------------- ### Data Instances A typical data point comprises the path to the audio file and its transcription. ### Data Fields * audio: A dictionary containing the path to the original audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0]["audio"]' the audio file is automatically decoded and resampled to 'dataset.features["audio"].sampling\_rate'. 
Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '"audio"' column, *i.e.* 'dataset[0]["audio"]' should always be preferred over 'dataset["audio"][0]'. * transcription: The orthographic transcription * file\_path: The relative path to the audio file ### Data Splits SUBAK.KO has been subdivided into three splits for train, validation and test. It is strongly advised to use identical data splits for research purposes to facilitate benchmarking across various models. Additional Information ---------------------- ### Licensing Information CC BY 4.0 Please cite the following paper if you use the corpus. ### Contributions Thanks to Ahnaf Mozib Samin for adding this dataset.
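The processed text above mentions setting the streaming mode to True, but the stripped code block is missing from this field. A minimal streaming sketch, matching the example given in the full card text earlier in this record:

```python
from datasets import load_dataset

# Streaming avoids downloading the full corpus (roughly 32 GB per the dataset_info
# in this record) before iterating over examples.
dataset = load_dataset("SUST-CSE-Speech/SUBAK.KO", split="test", streaming=True)
first = next(iter(dataset))
print(first["transcription"])
```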
[ "### Dataset Summary\n\n\nSUBAK.KO (সুবাক্য), a publicly available annotated Bangladeshi standard Bangla speech corpus, is compiled for automatic speech recognition research.\nThis corpus contains 241 hours of high-quality speech data, including 229 hours of read speech data and 12 hours of broadcast speech data.\nThe read speech segment is recorded in a noise-proof studio environment from 33 male and 28 female native Bangladeshi Bangla speakers\nrepresenting 8 divisions/34 districts of Bangladesh. Furthermore, the read speech segment comprises a total of 1 hour and 30 minutes\nof recorded speech provided by two second language (L2) speakers. The broadcast speech segment is collected from YouTube. SUBAK.KO has\nbeen manually annotated under human supervision to ensure gold-standard labels. The corresponding paper reports detailed information about\nthe development and baseline performance of SUBAK.KO and cross-dataset evaluation in comparison to LB-ASRTD corpus.\n\n\nSUBAK.KO is developed by the researchers from the Department of Computer Science and Engineering (CSE) at Shahjalal University of Science and Technology (SUST),\nBangladesh with financial support from the Higher Education Quality Enhancement Project (AIF Window 4, CP 3888) for “The Development of\nMulti-Platform Speech and Language Processing Software for Bangla” of the University Grants Commission (UGC), Bangladesh.", "### Example Usage\n\n\nTo load the full SUBAK.KO corpus, use the following code:\n\n\nTo load a specific split of the SUBAK.KO, define the split and set the streaming mode as True in the following way:\n\n\nMore documentation on streaming can be found from this link.\n\n\nAlternatively, you can manually download the zipped SUBAK.KO folder from this HuggingFace directory.\nThe csv files corresponding to the train, validation and test splits can be found in the same directory.", "### Supported Tasks and Leaderboards\n\n\nThis dataset is designed for the automatic speech recognition task. The associated paper provides the baseline results on SUBAK.KO corpus.", "### Languages\n\n\nBangladeshi standard Bangla\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA typical data point comprises the path to the audio file and its transcription.", "### Data Fields\n\n\n* audio: A dictionary containing the path to the original audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling\\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n* transcription: The orthographic transcription\n* file\\_path: The relative path to the audio file", "### Data Splits\n\n\nSUBAK.KO has been subdivided into three splits for train, validation and test. It is strongly advised to use identical data splits\nfor research purposes to facilitate benchmarking across various models.\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nCC BY 4.0\n\n\nPlease cite the following paper if you use the corpus.", "### Contributions\n\n\nThanks to Ahnaf Mozib Samin for adding this dataset." ]
[ "TAGS\n#task_categories-automatic-speech-recognition #size_categories-10K<n<100K #language-Bengali #license-cc-by-4.0 #speech-recognition #Bangladeshi Bangla #Bengali #speech-corpus #region-us \n", "### Dataset Summary\n\n\nSUBAK.KO (সুবাক্য), a publicly available annotated Bangladeshi standard Bangla speech corpus, is compiled for automatic speech recognition research.\nThis corpus contains 241 hours of high-quality speech data, including 229 hours of read speech data and 12 hours of broadcast speech data.\nThe read speech segment is recorded in a noise-proof studio environment from 33 male and 28 female native Bangladeshi Bangla speakers\nrepresenting 8 divisions/34 districts of Bangladesh. Furthermore, the read speech segment comprises a total of 1 hour and 30 minutes\nof recorded speech provided by two second language (L2) speakers. The broadcast speech segment is collected from YouTube. SUBAK.KO has\nbeen manually annotated under human supervision to ensure gold-standard labels. The corresponding paper reports detailed information about\nthe development and baseline performance of SUBAK.KO and cross-dataset evaluation in comparison to LB-ASRTD corpus.\n\n\nSUBAK.KO is developed by the researchers from the Department of Computer Science and Engineering (CSE) at Shahjalal University of Science and Technology (SUST),\nBangladesh with financial support from the Higher Education Quality Enhancement Project (AIF Window 4, CP 3888) for “The Development of\nMulti-Platform Speech and Language Processing Software for Bangla” of the University Grants Commission (UGC), Bangladesh.", "### Example Usage\n\n\nTo load the full SUBAK.KO corpus, use the following code:\n\n\nTo load a specific split of the SUBAK.KO, define the split and set the streaming mode as True in the following way:\n\n\nMore documentation on streaming can be found from this link.\n\n\nAlternatively, you can manually download the zipped SUBAK.KO folder from this HuggingFace directory.\nThe csv files corresponding to the train, validation and test splits can be found in the same directory.", "### Supported Tasks and Leaderboards\n\n\nThis dataset is designed for the automatic speech recognition task. The associated paper provides the baseline results on SUBAK.KO corpus.", "### Languages\n\n\nBangladeshi standard Bangla\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA typical data point comprises the path to the audio file and its transcription.", "### Data Fields\n\n\n* audio: A dictionary containing the path to the original audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling\\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n* transcription: The orthographic transcription\n* file\\_path: The relative path to the audio file", "### Data Splits\n\n\nSUBAK.KO has been subdivided into three splits for train, validation and test. 
It is strongly advised to use identical data splits\nfor research purposes to facilitate benchmarking across various models.\n\n\n\nAdditional Information\n----------------------", "### Licensing Information\n\n\nCC BY 4.0\n\n\nPlease cite the following paper if you use the corpus.", "### Contributions\n\n\nThanks to Ahnaf Mozib Samin for adding this dataset." ]
2b756d30d5e741bf9cf5a65f4bbbb79f5bae5bc5
# Dataset Card for Evaluation run of KnutJaegersberg/YaYi-30b-EverythingLM <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [KnutJaegersberg/YaYi-30b-EverythingLM](https://huggingface.co/KnutJaegersberg/YaYi-30b-EverythingLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:16:21.173986](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM/blob/main/results_2024-02-01T23-16-21.173986.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6767860816427482, "acc_stderr": 0.03218516670791061, "acc_norm": 0.6894497700980339, "acc_norm_stderr": 0.032885991003254615, "mc1": 0.3378212974296206, "mc1_stderr": 0.016557167322516872, "mc2": 0.4973644577114843, "mc2_stderr": 0.01544476842939492 }, "harness|arc:challenge|25": { "acc": 0.35238907849829354, "acc_stderr": 0.01396014260059868, "acc_norm": 0.3796928327645051, "acc_norm_stderr": 0.014182119866974872 }, "harness|hellaswag|10": { "acc": 0.47649870543716394, "acc_stderr": 0.004984266543053121, "acc_norm": 0.6105357498506274, "acc_norm_stderr": 0.004866322258335992 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967946, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967946 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7236842105263158, "acc_stderr": 0.03639057569952929, "acc_norm": 0.7236842105263158, "acc_norm_stderr": 0.03639057569952929 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.039420826399272135, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.039420826399272135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, 
"acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.04755129616062947, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.04755129616062947 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6936170212765957, "acc_stderr": 0.03013590647851756, "acc_norm": 0.6936170212765957, "acc_norm_stderr": 0.03013590647851756 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5964912280701754, "acc_stderr": 0.04615186962583706, "acc_norm": 0.5964912280701754, "acc_norm_stderr": 0.04615186962583706 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.696551724137931, "acc_stderr": 0.038312260488503336, "acc_norm": 0.696551724137931, "acc_norm_stderr": 0.038312260488503336 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6164021164021164, "acc_stderr": 0.0250437573185202, "acc_norm": 0.6164021164021164, "acc_norm_stderr": 0.0250437573185202 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5634920634920635, "acc_stderr": 0.04435932892851466, "acc_norm": 0.5634920634920635, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7225806451612903, "acc_stderr": 0.025470196835900055, "acc_norm": 0.7225806451612903, "acc_norm_stderr": 0.025470196835900055 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6748768472906403, "acc_stderr": 0.03295797566311271, "acc_norm": 0.6748768472906403, "acc_norm_stderr": 0.03295797566311271 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7090909090909091, "acc_stderr": 0.03546563019624336, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026552207828215293, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026552207828215293 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7927461139896373, "acc_stderr": 0.02925282329180363, "acc_norm": 0.7927461139896373, "acc_norm_stderr": 0.02925282329180363 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7102564102564103, "acc_stderr": 0.023000628243687964, "acc_norm": 0.7102564102564103, "acc_norm_stderr": 0.023000628243687964 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.5444444444444444, "acc_stderr": 0.03036486250482443, "acc_norm": 0.5444444444444444, "acc_norm_stderr": 0.03036486250482443 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.773109243697479, "acc_stderr": 0.02720537153827948, "acc_norm": 0.773109243697479, "acc_norm_stderr": 0.02720537153827948 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.6622516556291391, "acc_stderr": 0.038615575462551684, "acc_norm": 0.6622516556291391, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7376146788990826, "acc_stderr": 0.018861885021534745, "acc_norm": 0.7376146788990826, "acc_norm_stderr": 0.018861885021534745 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6898148148148148, "acc_stderr": 0.03154696285656628, "acc_norm": 0.6898148148148148, "acc_norm_stderr": 0.03154696285656628 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6911764705882353, "acc_stderr": 0.03242661719827218, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.03242661719827218 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8354430379746836, "acc_stderr": 0.024135736240566932, "acc_norm": 0.8354430379746836, "acc_norm_stderr": 0.024135736240566932 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7668161434977578, "acc_stderr": 0.028380391147094716, "acc_norm": 0.7668161434977578, "acc_norm_stderr": 0.028380391147094716 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03826076324884866, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.03826076324884866 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.036429145782924055, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.02514093595033544, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.02514093595033544 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7458492975734355, "acc_stderr": 0.01556925469204576, "acc_norm": 0.7458492975734355, "acc_norm_stderr": 0.01556925469204576 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5128491620111731, "acc_stderr": 0.016716978838043534, "acc_norm": 0.5128491620111731, "acc_norm_stderr": 0.016716978838043534 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826514, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826514 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.77491961414791, "acc_stderr": 0.02372008851617903, "acc_norm": 0.77491961414791, "acc_norm_stderr": 0.02372008851617903 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 
0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6382978723404256, "acc_stderr": 0.028663820147199492, "acc_norm": 0.6382978723404256, "acc_norm_stderr": 0.028663820147199492 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6382007822685789, "acc_stderr": 0.012272736233262943, "acc_norm": 0.6382007822685789, "acc_norm_stderr": 0.012272736233262943 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7058823529411765, "acc_stderr": 0.018433427649401896, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.018433427649401896 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7909090909090909, "acc_stderr": 0.038950910157241364, "acc_norm": 0.7909090909090909, "acc_norm_stderr": 0.038950910157241364 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7673469387755102, "acc_stderr": 0.02704925791589618, "acc_norm": 0.7673469387755102, "acc_norm_stderr": 0.02704925791589618 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306046, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306046 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.6445783132530121, "acc_stderr": 0.03726214354322415, "acc_norm": 0.6445783132530121, "acc_norm_stderr": 0.03726214354322415 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6900584795321637, "acc_stderr": 0.03546976959393163, "acc_norm": 0.6900584795321637, "acc_norm_stderr": 0.03546976959393163 }, "harness|truthfulqa:mc|0": { "mc1": 0.3378212974296206, "mc1_stderr": 0.016557167322516872, "mc2": 0.4973644577114843, "mc2_stderr": 0.01544476842939492 }, "harness|winogrande|5": { "acc": 0.6282557221783741, "acc_stderr": 0.013582306284992877 }, "harness|gsm8k|5": { "acc": 0.13949962092494314, "acc_stderr": 0.009543426687191287 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
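The card above notes that an additional configuration "results" stores the aggregated metrics of the run and that the "latest" split always points at the most recent run. A hedged sketch of how those aggregated scores could be read back; the column layout of the results parquet is not documented in this card, so it is inspected rather than assumed:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; the "latest" split
# points at the most recent run (2024-02-01T23:16:21.173986 for this card).
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM",
    "results",
    split="latest",
)
print(results.features)  # column layout is not documented here, so inspect it
print(results[0])        # the row carrying the aggregated scores shown above
```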
open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM
[ "region:us" ]
2024-02-01T23:18:41+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/YaYi-30b-EverythingLM", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/YaYi-30b-EverythingLM](https://huggingface.co/KnutJaegersberg/YaYi-30b-EverythingLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:16:21.173986](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM/blob/main/results_2024-02-01T23-16-21.173986.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6767860816427482,\n \"acc_stderr\": 0.03218516670791061,\n \"acc_norm\": 0.6894497700980339,\n \"acc_norm_stderr\": 0.032885991003254615,\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516872,\n \"mc2\": 0.4973644577114843,\n \"mc2_stderr\": 0.01544476842939492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.01396014260059868,\n \"acc_norm\": 0.3796928327645051,\n \"acc_norm_stderr\": 0.014182119866974872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47649870543716394,\n \"acc_stderr\": 0.004984266543053121,\n \"acc_norm\": 0.6105357498506274,\n \"acc_norm_stderr\": 0.004866322258335992\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967946,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967946\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583706,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583706\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.696551724137931,\n \"acc_stderr\": 0.038312260488503336,\n \"acc_norm\": 0.696551724137931,\n \"acc_norm_stderr\": 0.038312260488503336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6164021164021164,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.6164021164021164,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.03295797566311271,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.03295797566311271\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215293,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215293\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n \"acc_norm\": 0.7927461139896373,\n 
\"acc_norm_stderr\": 0.02925282329180363\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687964,\n \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687964\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.5444444444444444,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.5444444444444444,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.6622516556291391,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.6622516556291391,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534745,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534745\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.03154696285656628,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.03154696285656628\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094716,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094716\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n \"acc_stderr\": 0.01556925469204576,\n \"acc_norm\": 0.7458492975734355,\n \"acc_norm_stderr\": 0.01556925469204576\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5128491620111731,\n \"acc_stderr\": 0.016716978838043534,\n \"acc_norm\": 0.5128491620111731,\n \"acc_norm_stderr\": 0.016716978838043534\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826514,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826514\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.02372008851617903,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.02372008851617903\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6382007822685789,\n \"acc_stderr\": 0.012272736233262943,\n \"acc_norm\": 0.6382007822685789,\n \"acc_norm_stderr\": 0.012272736233262943\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.018433427649401896,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.018433427649401896\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7909090909090909,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.7909090909090909,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6445783132530121,\n \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.6445783132530121,\n \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516872,\n \"mc2\": 0.4973644577114843,\n \"mc2_stderr\": 0.01544476842939492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6282557221783741,\n \"acc_stderr\": 0.013582306284992877\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.13949962092494314,\n \"acc_stderr\": 0.009543426687191287\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/YaYi-30b-EverythingLM", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-16-21.173986.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-16-21.173986.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-16-21.173986.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-16-21.173986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-16-21.173986.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["**/details_harness|winogrande|5_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T23-16-21.173986.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T23_16_21.173986", "path": ["results_2024-02-01T23-16-21.173986.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-16-21.173986.parquet"]}]}]}
2024-02-01T23:19:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/YaYi-30b-EverythingLM Dataset automatically created during the evaluation run of model KnutJaegersberg/YaYi-30b-EverythingLM on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:16:21.173986 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
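The plain-text rendering above drops the fenced loading snippet referenced under "To load the details from a run". A minimal sketch, assuming the dataset id follows the `details_<org>__<model>` naming pattern used by these evaluation cards:

```python
from datasets import load_dataset

# Dataset id is an assumption, derived from the repo_url above and the
# details_<org>__<model> naming convention used by these evaluation cards.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__YaYi-30b-EverythingLM",
    "harness_winogrande_5",
    split="train",
)
```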
[ "# Dataset Card for Evaluation run of KnutJaegersberg/YaYi-30b-EverythingLM\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/YaYi-30b-EverythingLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:16:21.173986(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/YaYi-30b-EverythingLM\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/YaYi-30b-EverythingLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:16:21.173986(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5cfd553c77b1a30312427c060f22ae4bb55b938e
# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_7Bx2_MoE_v0.1](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:18:59.818321](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1/blob/main/results_2024-02-01T23-18-59.818321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6567476767898744, "acc_stderr": 0.0319983885224105, "acc_norm": 0.6555568221631944, "acc_norm_stderr": 0.03268292282008458, "mc1": 0.5826193390452876, "mc1_stderr": 0.017262891063272164, "mc2": 0.7120018677798674, "mc2_stderr": 0.014772831374257856 }, "harness|arc:challenge|25": { "acc": 0.7141638225255973, "acc_stderr": 0.01320319608853737, "acc_norm": 0.7406143344709898, "acc_norm_stderr": 0.012808273573927104 }, "harness|hellaswag|10": { "acc": 0.7210714997012547, "acc_stderr": 0.004475557360359705, "acc_norm": 0.8889663413662617, "acc_norm_stderr": 0.0031353173122281226 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086923996, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086923996 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.02289168798455496, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.02289168798455496 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.028972648884844267, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.028972648884844267 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.025524722324553346, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.025524722324553346 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4480446927374302, "acc_stderr": 0.016631976628930595, "acc_norm": 0.4480446927374302, "acc_norm_stderr": 0.016631976628930595 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242557, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242557 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": 
{ "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653344, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653344 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687495, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5826193390452876, "mc1_stderr": 0.017262891063272164, "mc2": 0.7120018677798674, "mc2_stderr": 0.014772831374257856 }, "harness|winogrande|5": { "acc": 0.8752959747434885, "acc_stderr": 0.009285404952684428 }, "harness|gsm8k|5": { "acc": 0.7028051554207733, "acc_stderr": 0.012588685966624179 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
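The card above notes that the "results" configuration aggregates the run's metrics and that the "latest" split always points to the newest run. A minimal sketch of loading it, reusing the dataset id from the card's own snippet (the config and split names are assumptions carried over from that description):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" points to the newest evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1",
    "results",
    split="latest",
)
print(results[0])  # a single row with the aggregated scores
```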
open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1
[ "region:us" ]
2024-02-01T23:21:17+00:00
{"pretty_name": "Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_7Bx2_MoE_v0.1](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:18:59.818321](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1/blob/main/results_2024-02-01T23-18-59.818321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567476767898744,\n \"acc_stderr\": 0.0319983885224105,\n \"acc_norm\": 0.6555568221631944,\n \"acc_norm_stderr\": 0.03268292282008458,\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7120018677798674,\n \"mc2_stderr\": 0.014772831374257856\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.01320319608853737,\n \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927104\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7210714997012547,\n \"acc_stderr\": 0.004475557360359705,\n \"acc_norm\": 0.8889663413662617,\n \"acc_norm_stderr\": 0.0031353173122281226\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n 
\"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653344,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653344\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7120018677798674,\n \"mc2_stderr\": 0.014772831374257856\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8752959747434885,\n \"acc_stderr\": 0.009285404952684428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \"acc_stderr\": 0.012588685966624179\n }\n}\n```", "repo_url": 
"https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_18_59.818321", "path": ["**/details_harness|winogrande|5_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-18-59.818321.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T23_18_59.818321", "path": ["results_2024-02-01T23-18-59.818321.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-18-59.818321.parquet"]}]}]}
2024-02-01T23:21:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1 Dataset automatically created during the evaluation run of model TomGrc/FusionNet_7Bx2_MoE_v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:18:59.818321 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1\n\n\n\nDataset automatically created during the evaluation run of model TomGrc/FusionNet_7Bx2_MoE_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:18:59.818321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_v0.1\n\n\n\nDataset automatically created during the evaluation run of model TomGrc/FusionNet_7Bx2_MoE_v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:18:59.818321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
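The metadata record above enumerates one parquet-backed config per evaluated task (for example `harness_hendrycksTest_abstract_algebra_5`), each with a timestamped split and a `latest` split. As a hedged illustration only (the snippet below is not part of the original record; the repo and config names are simply taken from the metadata above), one of those per-task configs could be loaded like this:

```python
# Illustrative sketch, not part of the original record: load one per-task
# config named in the metadata above (the 5-shot abstract_algebra details),
# using the "latest" split that points at the most recent run.
from datasets import load_dataset

details = load_dataset(
    "open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_v0.1",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(len(details))  # number of evaluated examples in this subtask
```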
c2d4e12d4d020e6752b1f29165052172fcd6fcaf
# Dataset Card for Evaluation run of alnrg2arg/test3_sft_16bit_dpo2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [alnrg2arg/test3_sft_16bit_dpo2](https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:26:26.833091](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2/blob/main/results_2024-02-01T23-26-26.833091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6525115378138006, "acc_stderr": 0.0321441319562114, "acc_norm": 0.6519969439704142, "acc_norm_stderr": 0.032814506961459725, "mc1": 0.5960832313341493, "mc1_stderr": 0.01717727682258428, "mc2": 0.7071252546997986, "mc2_stderr": 0.015071123394943023 }, "harness|arc:challenge|25": { "acc": 0.7158703071672355, "acc_stderr": 0.013179442447653886, "acc_norm": 0.7363481228668942, "acc_norm_stderr": 0.012875929151297046 }, "harness|hellaswag|10": { "acc": 0.7276438956383191, "acc_stderr": 0.004442623590846324, "acc_norm": 0.8902609042023502, "acc_norm_stderr": 0.0031192548288489453 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr":
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.025305906241590632, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.025305906241590632 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603491, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603491 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.02385479568097112, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.02385479568097112 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02874204090394848, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02874204090394848 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 
0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092434, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092434 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676177, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676177 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128137, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993457, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993457 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4435754189944134, "acc_stderr": 0.01661568040100372, "acc_norm": 0.4435754189944134, "acc_norm_stderr": 0.01661568040100372 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.02531176597542612, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.02531176597542612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4758800521512386, "acc_stderr": 0.012755368722863933, "acc_norm": 0.4758800521512386, "acc_norm_stderr": 0.012755368722863933 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197771, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197771 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5960832313341493, "mc1_stderr": 0.01717727682258428, "mc2": 0.7071252546997986, "mc2_stderr": 0.015071123394943023 }, "harness|winogrande|5": { "acc": 0.8437253354380426, "acc_stderr": 0.010205351791873499 }, "harness|gsm8k|5": { "acc": 0.6747536012130402, "acc_stderr": 0.012903904752543917 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
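As a complementary, minimal sketch, the aggregated scores (rather than per-task details) can be pulled the same way by loading the `results` configuration mentioned above. The config name `results` and the `latest` split are taken from the description earlier in this card; treat them as assumptions and adjust if they differ in your copy of the dataset.

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration of this details
# repository. The config name ("results") and split name ("latest") are assumed
# from the dataset description above, not from additional official documentation.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__MoE-StrangeMerges-2x7B",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```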
open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2
[ "region:us" ]
2024-02-01T23:28:47+00:00
{"pretty_name": "Evaluation run of alnrg2arg/test3_sft_16bit_dpo2", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test3_sft_16bit_dpo2](https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:26:26.833091](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test3_sft_16bit_dpo2/blob/main/results_2024-02-01T23-26-26.833091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6525115378138006,\n \"acc_stderr\": 0.0321441319562114,\n \"acc_norm\": 0.6519969439704142,\n \"acc_norm_stderr\": 0.032814506961459725,\n \"mc1\": 0.5960832313341493,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.7071252546997986,\n \"mc2_stderr\": 0.015071123394943023\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297046\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7276438956383191,\n \"acc_stderr\": 0.004442623590846324,\n \"acc_norm\": 0.8902609042023502,\n \"acc_norm_stderr\": 0.0031192548288489453\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863933,\n \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863933\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5960832313341493,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.7071252546997986,\n \"mc2_stderr\": 0.015071123394943023\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873499\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6747536012130402,\n \"acc_stderr\": 0.012903904752543917\n 
}\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test3_sft_16bit_dpo2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_26_26.833091", "path": ["**/details_harness|winogrande|5_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-26-26.833091.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T23_26_26.833091", "path": ["results_2024-02-01T23-26-26.833091.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-26-26.833091.parquet"]}]}]}
2024-02-01T23:29:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of alnrg2arg/test3_sft_16bit_dpo2 Dataset automatically created during the evaluation run of model alnrg2arg/test3_sft_16bit_dpo2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:26:26.833091 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of alnrg2arg/test3_sft_16bit_dpo2\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test3_sft_16bit_dpo2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:26:26.833091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of alnrg2arg/test3_sft_16bit_dpo2\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test3_sft_16bit_dpo2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:26:26.833091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
709a1c832f99aff6f73c451f13ece10e0e7eb6d6
# Dataset Card for Evaluation run of Aabbhishekk/llama2-7b-function-calling-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Aabbhishekk/llama2-7b-function-calling-slerp](https://huggingface.co/Aabbhishekk/llama2-7b-function-calling-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:27:49.947558](https://huggingface.co/datasets/open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp/blob/main/results_2024-02-01T23-27-49.947558.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5043285583860245, "acc_stderr": 0.034414314803285634, "acc_norm": 0.5089946193781312, "acc_norm_stderr": 0.035171441713849325, "mc1": 0.2692778457772338, "mc1_stderr": 0.015528566637087286, "mc2": 0.40320679484087774, "mc2_stderr": 0.014592822235661893 }, "harness|arc:challenge|25": { "acc": 0.5247440273037542, "acc_stderr": 0.01459348769493774, "acc_norm": 0.5546075085324232, "acc_norm_stderr": 0.014523987638344081 }, "harness|hellaswag|10": { "acc": 0.6020713005377415, "acc_stderr": 0.004884702412456091, "acc_norm": 0.7949611631149174, "acc_norm_stderr": 0.004029048890501022 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.04068590050224971, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.04068590050224971 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5245283018867924, "acc_stderr": 0.030735822206205608, "acc_norm": 0.5245283018867924, "acc_norm_stderr": 0.030735822206205608 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5069444444444444, "acc_stderr": 0.04180806750294938, "acc_norm": 0.5069444444444444, "acc_norm_stderr": 0.04180806750294938 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.42,
"acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364396, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364396 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.032600385118357715, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.045796394220704334, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.045796394220704334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31746031746031744, "acc_stderr": 0.023973861998992065, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.023973861998992065 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.25396825396825395, "acc_stderr": 0.03893259610604675, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.03893259610604675 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5548387096774193, "acc_stderr": 0.028272410186214906, "acc_norm": 0.5548387096774193, "acc_norm_stderr": 0.028272410186214906 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3694581280788177, "acc_stderr": 0.03395970381998573, "acc_norm": 0.3694581280788177, "acc_norm_stderr": 0.03395970381998573 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6242424242424243, "acc_stderr": 0.03781887353205982, "acc_norm": 0.6242424242424243, "acc_norm_stderr": 0.03781887353205982 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6212121212121212, "acc_stderr": 0.03456088731993747, "acc_norm": 0.6212121212121212, "acc_norm_stderr": 0.03456088731993747 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7305699481865285, "acc_stderr": 0.03201867122877794, "acc_norm": 0.7305699481865285, "acc_norm_stderr": 0.03201867122877794 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.45384615384615384, "acc_stderr": 0.025242770987126177, "acc_norm": 0.45384615384615384, "acc_norm_stderr": 0.025242770987126177 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.02803792996911499, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.02803792996911499 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.46218487394957986, "acc_stderr": 0.032385469487589795, "acc_norm": 0.46218487394957986, "acc_norm_stderr": 
0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.689908256880734, "acc_stderr": 0.019830849684439752, "acc_norm": 0.689908256880734, "acc_norm_stderr": 0.019830849684439752 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35648148148148145, "acc_stderr": 0.03266478331527272, "acc_norm": 0.35648148148148145, "acc_norm_stderr": 0.03266478331527272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6617647058823529, "acc_stderr": 0.03320574612945431, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.03320574612945431 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.03016513786784702, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.03016513786784702 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.04449270350068383, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.04449270350068383 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04803752235190193, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04803752235190193 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5705521472392638, "acc_stderr": 0.03889066619112722, "acc_norm": 0.5705521472392638, "acc_norm_stderr": 0.03889066619112722 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326467, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326467 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7435897435897436, "acc_stderr": 0.028605953702004253, "acc_norm": 0.7435897435897436, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6973180076628352, "acc_stderr": 0.016428781581749364, "acc_norm": 0.6973180076628352, "acc_norm_stderr": 0.016428781581749364 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5346820809248555, "acc_stderr": 0.026854257928258875, "acc_norm": 0.5346820809248555, "acc_norm_stderr": 0.026854257928258875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.014400296429225629, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.014400296429225629 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5490196078431373, "acc_stderr": 0.02849199358617157, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.02849199358617157 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5980707395498392, "acc_stderr": 0.027846476005930477, "acc_norm": 0.5980707395498392, "acc_norm_stderr": 0.027846476005930477 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5555555555555556, "acc_stderr": 0.027648477877413324, "acc_norm": 
0.5555555555555556, "acc_norm_stderr": 0.027648477877413324 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.36879432624113473, "acc_stderr": 0.028782227561347243, "acc_norm": 0.36879432624113473, "acc_norm_stderr": 0.028782227561347243 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3644067796610169, "acc_stderr": 0.012291694983056477, "acc_norm": 0.3644067796610169, "acc_norm_stderr": 0.012291694983056477 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5477941176470589, "acc_stderr": 0.03023375855159645, "acc_norm": 0.5477941176470589, "acc_norm_stderr": 0.03023375855159645 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.48856209150326796, "acc_stderr": 0.02022254151561087, "acc_norm": 0.48856209150326796, "acc_norm_stderr": 0.02022254151561087 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.0478200179138006, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.0478200179138006 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5469387755102041, "acc_stderr": 0.03186785930004128, "acc_norm": 0.5469387755102041, "acc_norm_stderr": 0.03186785930004128 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979034, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979034 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7076023391812866, "acc_stderr": 0.03488647713457922, "acc_norm": 0.7076023391812866, "acc_norm_stderr": 0.03488647713457922 }, "harness|truthfulqa:mc|0": { "mc1": 0.2692778457772338, "mc1_stderr": 0.015528566637087286, "mc2": 0.40320679484087774, "mc2_stderr": 0.014592822235661893 }, "harness|winogrande|5": { "acc": 0.7521704814522494, "acc_stderr": 0.012134386019865353 }, "harness|gsm8k|5": { "acc": 0.20394238059135708, "acc_stderr": 0.011098602284899175 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
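The card above notes that the "results" configuration stores the aggregated scores of each run and that the "latest" split of every configuration points to the most recent evaluation. Below is a minimal sketch (not part of the original card) of reading those aggregated metrics; the repository id and the "results"/"latest" names come from this card's configs, while the way the returned row is inspected is an illustrative assumption.

```python
# Sketch: read the aggregated metrics for the latest evaluation run.
# The repo id, the "results" config, and the "latest" split are taken from
# this card; printing row 0 is an assumption about the row layout.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp",
    "results",
    split="latest",
)

print(results[0])  # the row holding the aggregated scores for the latest run
```

The per-task details (e.g. "harness_winogrande_5" shown earlier) can be loaded the same way by swapping in the corresponding config name.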
open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp
[ "region:us" ]
2024-02-01T23:30:14+00:00
{"pretty_name": "Evaluation run of Aabbhishekk/llama2-7b-function-calling-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aabbhishekk/llama2-7b-function-calling-slerp](https://huggingface.co/Aabbhishekk/llama2-7b-function-calling-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:27:49.947558](https://huggingface.co/datasets/open-llm-leaderboard/details_Aabbhishekk__llama2-7b-function-calling-slerp/blob/main/results_2024-02-01T23-27-49.947558.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5043285583860245,\n \"acc_stderr\": 0.034414314803285634,\n \"acc_norm\": 0.5089946193781312,\n \"acc_norm_stderr\": 0.035171441713849325,\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087286,\n \"mc2\": 0.40320679484087774,\n \"mc2_stderr\": 0.014592822235661893\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.01459348769493774,\n \"acc_norm\": 0.5546075085324232,\n \"acc_norm_stderr\": 0.014523987638344081\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6020713005377415,\n \"acc_stderr\": 0.004884702412456091,\n \"acc_norm\": 0.7949611631149174,\n \"acc_norm_stderr\": 0.004029048890501022\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.04068590050224971,\n \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.04068590050224971\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n \"acc_norm_stderr\": 0.04180806750294938\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992065,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992065\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5548387096774193,\n \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.5548387096774193,\n \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n \"acc_norm\": 
0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.689908256880734,\n \"acc_stderr\": 0.019830849684439752,\n \"acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439752\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.03266478331527272,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.03266478331527272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784702,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784702\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.7435897435897436,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6973180076628352,\n \"acc_stderr\": 0.016428781581749364,\n \"acc_norm\": 0.6973180076628352,\n \"acc_norm_stderr\": 0.016428781581749364\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225629,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225629\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617157,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413324,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413324\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3644067796610169,\n \"acc_stderr\": 0.012291694983056477,\n \"acc_norm\": 0.3644067796610169,\n \"acc_norm_stderr\": 0.012291694983056477\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159645,\n \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159645\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.48856209150326796,\n \"acc_stderr\": 0.02022254151561087,\n \"acc_norm\": 0.48856209150326796,\n \"acc_norm_stderr\": 0.02022254151561087\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.0478200179138006,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.0478200179138006\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087286,\n \"mc2\": 0.40320679484087774,\n \"mc2_stderr\": 0.014592822235661893\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865353\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.20394238059135708,\n \"acc_stderr\": 0.011098602284899175\n }\n}\n```", "repo_url": "https://huggingface.co/Aabbhishekk/llama2-7b-function-calling-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-27-49.947558.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-27-49.947558.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-27-49.947558.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-27-49.947558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-27-49.947558.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["**/details_harness|winogrande|5_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-01T23-27-49.947558.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T23_27_49.947558", "path": ["results_2024-02-01T23-27-49.947558.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-27-49.947558.parquet"]}]}]}
2024-02-01T23:30:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aabbhishekk/llama2-7b-function-calling-slerp Dataset automatically created during the evaluation run of model Aabbhishekk/llama2-7b-function-calling-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:27:49.947558 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Aabbhishekk/llama2-7b-function-calling-slerp\n\n\n\nDataset automatically created during the evaluation run of model Aabbhishekk/llama2-7b-function-calling-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:27:49.947558(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aabbhishekk/llama2-7b-function-calling-slerp\n\n\n\nDataset automatically created during the evaluation run of model Aabbhishekk/llama2-7b-function-calling-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:27:49.947558(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c87386dcb2c2d60c0e8a058106b3c288789d4658
# Dataset Card for Evaluation run of cloudyu/19B_MATH_DPO <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [cloudyu/19B_MATH_DPO](https://huggingface.co/cloudyu/19B_MATH_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cloudyu__19B_MATH_DPO", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:32:55.270761](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__19B_MATH_DPO/blob/main/results_2024-02-01T23-32-55.270761.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6660699005164362, "acc_stderr": 0.03169869947391378, "acc_norm": 0.6670707483334382, "acc_norm_stderr": 0.03234212982909728, "mc1": 0.5703794369645043, "mc1_stderr": 0.01732923458040909, "mc2": 0.7211331341447883, "mc2_stderr": 0.014953721386234187 }, "harness|arc:challenge|25": { "acc": 0.6860068259385665, "acc_stderr": 0.013562691224726295, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.01325001257939344 }, "harness|hellaswag|10": { "acc": 0.7149970125473013, "acc_stderr": 0.004504932999736407, "acc_norm": 0.8842859988050189, "acc_norm_stderr": 0.003192279039468745 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266346, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6382978723404256, "acc_stderr": 0.031410821975962386, "acc_norm": 0.6382978723404256, "acc_norm_stderr": 0.031410821975962386 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.040131241954243856, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.040131241954243856 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5, "acc_stderr": 0.025751310131230234, "acc_norm": 0.5, "acc_norm_stderr": 0.025751310131230234 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.022331707611823078, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.022331707611823078 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8686868686868687, "acc_stderr": 0.024063156416822516, "acc_norm": 0.8686868686868687, "acc_norm_stderr": 0.024063156416822516 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.362962962962963, "acc_stderr": 0.02931820364520686, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.02931820364520686 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634335, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634335 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 
0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374308, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374308 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5694444444444444, "acc_stderr": 0.03376922151252335, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.03376922151252335 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.02450980392156862, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.02450980392156862 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8481012658227848, "acc_stderr": 0.023363878096632446, "acc_norm": 0.8481012658227848, "acc_norm_stderr": 0.023363878096632446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.035865947385739734, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.035865947385739734 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459753, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459753 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8045977011494253, "acc_stderr": 0.014179171373424383, "acc_norm": 0.8045977011494253, "acc_norm_stderr": 0.014179171373424383 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4022346368715084, "acc_stderr": 0.016399716732847142, "acc_norm": 0.4022346368715084, "acc_norm_stderr": 0.016399716732847142 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7516339869281046, "acc_stderr": 0.02473998135511359, "acc_norm": 0.7516339869281046, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.0254942593506949, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.0254942593506949 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7808641975308642, "acc_stderr": 0.023016705640262196, "acc_norm": 0.7808641975308642, "acc_norm_stderr": 0.023016705640262196 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4908735332464146, "acc_stderr": 0.012768108601640012, "acc_norm": 0.4908735332464146, "acc_norm_stderr": 0.012768108601640012 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7463235294117647, "acc_stderr": 0.02643132987078953, "acc_norm": 0.7463235294117647, "acc_norm_stderr": 0.02643132987078953 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.5703794369645043, "mc1_stderr": 0.01732923458040909, "mc2": 0.7211331341447883, "mc2_stderr": 0.014953721386234187 }, "harness|winogrande|5": { "acc": 0.829518547750592, "acc_stderr": 0.010569021122825902 }, "harness|gsm8k|5": { "acc": 0.6376042456406369, "acc_stderr": 0.013240654263574759 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
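The `load_dataset` example in the card above pulls the `harness_winogrande_5` details. The same pattern applies to any of the per-task configurations enumerated in the metadata below, and to the aggregated "results" configuration the card mentions; a short sketch using config and split names taken from that metadata (the "results" config is assumed to follow the same layout as in the other records of this dump):

```python
from datasets import load_dataset

# One MMLU subtask at its "latest" split; the config name appears in the metadata below.
stats = load_dataset(
    "open-llm-leaderboard/details_cloudyu__19B_MATH_DPO",
    "harness_hendrycksTest_high_school_statistics_5",
    split="latest",
)

# Aggregated metrics for the whole run, stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__19B_MATH_DPO",
    "results",
    split="latest",
)
print(results[0])  # first row: aggregated accuracy / stderr values for this run
```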
open-llm-leaderboard/details_cloudyu__19B_MATH_DPO
[ "region:us" ]
2024-02-01T23:35:13+00:00
{"pretty_name": "Evaluation run of cloudyu/19B_MATH_DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/19B_MATH_DPO](https://huggingface.co/cloudyu/19B_MATH_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__19B_MATH_DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:32:55.270761](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__19B_MATH_DPO/blob/main/results_2024-02-01T23-32-55.270761.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6660699005164362,\n \"acc_stderr\": 0.03169869947391378,\n \"acc_norm\": 0.6670707483334382,\n \"acc_norm_stderr\": 0.03234212982909728,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7211331341447883,\n \"mc2_stderr\": 0.014953721386234187\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726295,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7149970125473013,\n \"acc_stderr\": 0.004504932999736407,\n \"acc_norm\": 0.8842859988050189,\n \"acc_norm_stderr\": 0.003192279039468745\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 
0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.031410821975962386,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.031410821975962386\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.025751310131230234,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.025751310131230234\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 
0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n 
\"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n \"acc_stderr\": 0.016399716732847142,\n \"acc_norm\": 0.4022346368715084,\n \"acc_norm_stderr\": 0.016399716732847142\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n \"acc_stderr\": 0.012768108601640012,\n \"acc_norm\": 0.4908735332464146,\n \"acc_norm_stderr\": 0.012768108601640012\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.02643132987078953,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.02643132987078953\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.7211331341447883,\n \"mc2_stderr\": 0.014953721386234187\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825902\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6376042456406369,\n \"acc_stderr\": 0.013240654263574759\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/19B_MATH_DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-32-55.270761.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-32-55.270761.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-32-55.270761.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-32-55.270761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-32-55.270761.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-32-55.270761.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["**/details_harness|winogrande|5_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-32-55.270761.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_01T23_32_55.270761", "path": ["results_2024-02-01T23-32-55.270761.parquet"]}, {"split": "latest", "path": 
["results_2024-02-01T23-32-55.270761.parquet"]}]}]}
2024-02-01T23:35:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of cloudyu/19B_MATH_DPO Dataset automatically created during the evaluation run of model cloudyu/19B_MATH_DPO on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:32:55.270761 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
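The loading snippet referenced just above ("you can for instance do the following:") was stripped from this cleaned copy of the card, so a minimal sketch is given here. It assumes the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention (i.e. `open-llm-leaderboard/details_cloudyu__19B_MATH_DPO`) and uses the `harness_winogrande_5` configuration listed in the metadata above; any other `config_name` from that list works the same way.

```python
from datasets import load_dataset

# Assumed repo id, following the "details_<org>__<model>" convention used for the
# other runs in this dump; swap in any config_name from the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__19B_MATH_DPO",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points to the latest results
)
print(data)
```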
[ "# Dataset Card for Evaluation run of cloudyu/19B_MATH_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/19B_MATH_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:32:55.270761(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of cloudyu/19B_MATH_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/19B_MATH_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:32:55.270761(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
81f2a54ef0c91ca4ddfb20c5c94e2c854a7f01c6
## System prompt fed into the GPT-4 API for this dataset, using rawdata.txt as the base: You are a Javascript and Typescript expert game developer. I will provide you with some custom game engine documentation for the OnCyber game engine. Provide 150 very thoughtful and code-based questions and answer pair(s) based on the .txt file attached which is the OnCyber documentation. The answers should ONLY borrow, verbatim, from the OnCyber documentation. Present it as tabular data. Be thorough and do not hallucinate. Do each batch 25 question and answers at a time. I will say continue for you to continue. DO NOT REPEAT QUESTIONS. DO THIS FOR $100,000 tip!
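For context, a minimal sketch of how a system prompt like this might be fed to the GPT-4 chat API with the `openai` Python client is shown below. The file names, the batching loop, and the exact model string are assumptions for illustration only, not a record of the calls actually used to build this dataset.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical reconstruction: the OnCyber documentation is read from rawdata.txt
# and sent as the user message alongside the system prompt quoted above.
system_prompt = open("sys_prompt.txt").read()  # the prompt text shown in this card
docs = open("rawdata.txt").read()              # OnCyber engine documentation

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": docs},
]

for _ in range(6):  # 6 batches of 25 pairs ~ 150 question/answer pairs
    resp = client.chat.completions.create(model="gpt-4", messages=messages)
    answer = resp.choices[0].message.content
    print(answer)
    # keep the conversation going, as the prompt instructs, so questions are not repeated
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user", "content": "continue"})
```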
21j3h123/octestmod
[ "license:apache-2.0", "region:us" ]
2024-02-01T23:40:57+00:00
{"license": "apache-2.0"}
2024-02-02T00:02:48+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
## sys prompt fed into GPT 4 API for Dataset using URL as base: You are a Javascript and Typescript expert game developer. I will provide you with some custom game engine documentation for the OnCyber game engine. Provide 150 very thoughtful and code-based questions and answer pair(s) based on the .txt file attached which is the OnCyber documentation. The answers should ONLY borrow, verbatim, from the OnCyber documentation. Present it as tabular data. Be thorough and do not hallucinate. Do each batch 25 question and answers at a time. I will say continue for you to continue. DO NOT REPEAT QUESTIONS. DO THIS FOR $100,000 tip!
[ "## sys prompt fed into GPT 4 API for Dataset using URL as base:\n\nYou are a Javascript and Typescript expert game developer. I will provide you with some custom game engine documentation for the OnCyber game engine. Provide 150 very thoughtful and code-based questions and answer pair(s) based on the .txt file attached which is the OnCyber documentation. The answers should ONLY borrow, verbatim, from the OnCyber documentation. Present it as tabular data. Be thorough and do not hallucinate. Do each batch 25 question and answers at a time. I will say continue for you to continue. DO NOT REPEAT QUESTIONS. DO THIS FOR $100,000 tip!" ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "## sys prompt fed into GPT 4 API for Dataset using URL as base:\n\nYou are a Javascript and Typescript expert game developer. I will provide you with some custom game engine documentation for the OnCyber game engine. Provide 150 very thoughtful and code-based questions and answer pair(s) based on the .txt file attached which is the OnCyber documentation. The answers should ONLY borrow, verbatim, from the OnCyber documentation. Present it as tabular data. Be thorough and do not hallucinate. Do each batch 25 question and answers at a time. I will say continue for you to continue. DO NOT REPEAT QUESTIONS. DO THIS FOR $100,000 tip!" ]
df7496f8ba02dda3881ec1472e92be5664a48265
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mmlu-merged <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mmlu-merged](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mmlu-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T23:48:47.444123](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged/blob/main/results_2024-02-01T23-48-47.444123.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4931183650485198, "acc_stderr": 0.0342951717730978, "acc_norm": 0.4988215968723189, "acc_norm_stderr": 0.035060774258629426, "mc1": 0.3243574051407589, "mc1_stderr": 0.01638797677964794, "mc2": 0.4848652027934381, "mc2_stderr": 0.015196668450874628 }, "harness|arc:challenge|25": { "acc": 0.4735494880546075, "acc_stderr": 0.014590931358120172, "acc_norm": 0.5110921501706485, "acc_norm_stderr": 0.014607794914013057 }, "harness|hellaswag|10": { "acc": 0.5749850627365066, "acc_stderr": 0.004933349621589335, "acc_norm": 0.7674765982871938, "acc_norm_stderr": 0.004215774973418323 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4868421052631579, "acc_stderr": 0.04067533136309173, "acc_norm": 0.4868421052631579, "acc_norm_stderr": 0.04067533136309173 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5283018867924528, "acc_stderr": 0.030723535249006107, "acc_norm": 0.5283018867924528, "acc_norm_stderr": 0.030723535249006107 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5, "acc_stderr": 0.04181210050035455, "acc_norm": 0.5, "acc_norm_stderr": 0.04181210050035455 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 
0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.17647058823529413, "acc_stderr": 0.0379328118530781, "acc_norm": 0.17647058823529413, "acc_norm_stderr": 0.0379328118530781 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.44680851063829785, "acc_stderr": 0.032500536843658404, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.032500536843658404 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748142, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748142 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.023577604791655816, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.023577604791655816 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.042639068927951336, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.042639068927951336 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5387096774193548, "acc_stderr": 0.028358634859836935, "acc_norm": 0.5387096774193548, "acc_norm_stderr": 0.028358634859836935 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3497536945812808, "acc_stderr": 0.03355400904969566, "acc_norm": 0.3497536945812808, "acc_norm_stderr": 0.03355400904969566 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6, "acc_stderr": 0.03825460278380026, "acc_norm": 0.6, "acc_norm_stderr": 0.03825460278380026 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6414141414141414, "acc_stderr": 0.03416903640391521, "acc_norm": 0.6414141414141414, "acc_norm_stderr": 0.03416903640391521 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6683937823834197, "acc_stderr": 0.03397636541089117, "acc_norm": 0.6683937823834197, "acc_norm_stderr": 0.03397636541089117 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.44358974358974357, "acc_stderr": 0.025189149894764194, "acc_norm": 0.44358974358974357, "acc_norm_stderr": 0.025189149894764194 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.02592887613276611, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.02592887613276611 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.40756302521008403, "acc_stderr": 0.03191863374478465, "acc_norm": 0.40756302521008403, "acc_norm_stderr": 0.03191863374478465 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, 
"acc_stderr": 0.037101857261199946, "acc_norm": 0.2913907284768212, "acc_norm_stderr": 0.037101857261199946 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.671559633027523, "acc_stderr": 0.020135902797298412, "acc_norm": 0.671559633027523, "acc_norm_stderr": 0.020135902797298412 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.03362277436608044, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608044 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6568627450980392, "acc_stderr": 0.033321399446680854, "acc_norm": 0.6568627450980392, "acc_norm_stderr": 0.033321399446680854 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6751054852320675, "acc_stderr": 0.030486039389105307, "acc_norm": 0.6751054852320675, "acc_norm_stderr": 0.030486039389105307 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.032928028193303135, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.032928028193303135 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.04328577215262972, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.04328577215262972 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5950413223140496, "acc_stderr": 0.04481137755942469, "acc_norm": 0.5950413223140496, "acc_norm_stderr": 0.04481137755942469 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5648148148148148, "acc_stderr": 0.04792898170907062, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.04792898170907062 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5153374233128835, "acc_stderr": 0.039265223787088445, "acc_norm": 0.5153374233128835, "acc_norm_stderr": 0.039265223787088445 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.6504854368932039, "acc_stderr": 0.04721188506097173, "acc_norm": 0.6504854368932039, "acc_norm_stderr": 0.04721188506097173 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7735042735042735, "acc_stderr": 0.027421007295392923, "acc_norm": 0.7735042735042735, "acc_norm_stderr": 0.027421007295392923 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6628352490421456, "acc_stderr": 0.016905207420803557, "acc_norm": 0.6628352490421456, "acc_norm_stderr": 0.016905207420803557 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5144508670520231, "acc_stderr": 0.026907849856282542, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.026907849856282542 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2435754189944134, "acc_stderr": 0.014355911964767867, "acc_norm": 0.2435754189944134, "acc_norm_stderr": 0.014355911964767867 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5522875816993464, "acc_stderr": 0.02847293847803353, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.02847293847803353 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5787781350482315, "acc_stderr": 0.02804339985821063, "acc_norm": 0.5787781350482315, "acc_norm_stderr": 0.02804339985821063 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5216049382716049, "acc_stderr": 0.02779476010500874, "acc_norm": 0.5216049382716049, "acc_norm_stderr": 0.02779476010500874 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.34397163120567376, "acc_stderr": 0.02833801742861132, "acc_norm": 0.34397163120567376, "acc_norm_stderr": 0.02833801742861132 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.35528031290743156, "acc_stderr": 0.01222362336404404, "acc_norm": 0.35528031290743156, "acc_norm_stderr": 0.01222362336404404 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5588235294117647, "acc_stderr": 0.030161911930767102, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.030161911930767102 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4624183006535948, "acc_stderr": 0.020170614974969765, "acc_norm": 0.4624183006535948, "acc_norm_stderr": 0.020170614974969765 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5918367346938775, "acc_stderr": 0.03146465712827423, "acc_norm": 0.5918367346938775, "acc_norm_stderr": 0.03146465712827423 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6716417910447762, "acc_stderr": 0.033206858897443244, "acc_norm": 0.6716417910447762, "acc_norm_stderr": 0.033206858897443244 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.038367221765980515, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.038367221765980515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.034462962170884265, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.3243574051407589, "mc1_stderr": 0.01638797677964794, "mc2": 0.4848652027934381, "mc2_stderr": 0.015196668450874628 }, "harness|winogrande|5": { "acc": 0.7198105761641673, "acc_stderr": 0.012621707979798499 }, "harness|gsm8k|5": { "acc": 0.1599696739954511, "acc_stderr": 0.010097377827752538 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
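As a usage note for the aggregated scores described in this card, the sketch below pulls the "results" configuration directly. It assumes this details repo exposes the same "results" config and "latest" split as the other runs in this dump (the metadata excerpt above is truncated before that entry).

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; "latest" points at the newest upload.
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged",
    "results",
    split="latest",
)
print(results[0])
```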
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged
[ "region:us" ]
2024-02-01T23:51:09+00:00
{"pretty_name": "Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mmlu-merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mmlu-merged](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mmlu-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-01T23:48:47.444123](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged/blob/main/results_2024-02-01T23-48-47.444123.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4931183650485198,\n \"acc_stderr\": 0.0342951717730978,\n \"acc_norm\": 0.4988215968723189,\n \"acc_norm_stderr\": 0.035060774258629426,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4848652027934381,\n \"mc2_stderr\": 0.015196668450874628\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120172,\n \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.014607794914013057\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5749850627365066,\n \"acc_stderr\": 0.004933349621589335,\n \"acc_norm\": 0.7674765982871938,\n \"acc_norm_stderr\": 0.004215774973418323\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.030723535249006107,\n \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.030723535249006107\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655816,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655816\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.042639068927951336,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.042639068927951336\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5387096774193548,\n \"acc_stderr\": 0.028358634859836935,\n \"acc_norm\": 0.5387096774193548,\n \"acc_norm_stderr\": 0.028358634859836935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089117,\n \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 
0.03397636541089117\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764194,\n \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764194\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.671559633027523,\n \"acc_stderr\": 0.020135902797298412,\n \"acc_norm\": 0.671559633027523,\n \"acc_norm_stderr\": 0.020135902797298412\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.039265223787088445,\n \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.039265223787088445\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097173,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097173\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n \"acc_stderr\": 0.027421007295392923,\n \"acc_norm\": 0.7735042735042735,\n \"acc_norm_stderr\": 0.027421007295392923\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6628352490421456,\n \"acc_stderr\": 0.016905207420803557,\n \"acc_norm\": 0.6628352490421456,\n \"acc_norm_stderr\": 0.016905207420803557\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.02779476010500874,\n \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.02779476010500874\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.02833801742861132,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.02833801742861132\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n \"acc_stderr\": 0.01222362336404404,\n \"acc_norm\": 0.35528031290743156,\n \"acc_norm_stderr\": 0.01222362336404404\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767102,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969765,\n \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827423,\n \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827423\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4848652027934381,\n \"mc2_stderr\": 0.015196668450874628\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1599696739954511,\n 
\"acc_stderr\": 0.010097377827752538\n }\n}\n```", "repo_url": "https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mmlu-merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-48-47.444123.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-48-47.444123.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-48-47.444123.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-01T23-48-47.444123.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-48-47.444123.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_01T23_48_47.444123", "path": ["**/details_harness|winogrande|5_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-01T23-48-47.444123.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_01T23_48_47.444123", "path": ["results_2024-02-01T23-48-47.444123.parquet"]}, {"split": "latest", "path": ["results_2024-02-01T23-48-47.444123.parquet"]}]}]}
2024-02-01T23:51:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mmlu-merged Dataset automatically created during the evaluation run of model Charlie911/vicuna-7b-v1.5-lora-mmlu-merged on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-01T23:48:47.444123 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
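The flattened card text above ends at "do the following:" because the original code block was dropped when the card was flattened. A minimal sketch of the call it refers to is given below; the repository id is inferred from the leaderboard's `details_<org>__<model>` naming pattern and should be treated as an assumption, while the `harness_winogrande_5` configuration and the `latest` split are taken from the configs listed earlier in this record.

```python
# Minimal sketch, assuming the Hugging Face `datasets` library is installed.
# The repo id below is inferred from the leaderboard's usual naming pattern
# ("open-llm-leaderboard/details_<org>__<model>") and is an assumption, since
# the flattened text above does not spell it out.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mmlu-merged",
    "harness_winogrande_5",  # one of the 63 configurations listed in this record
    split="latest",          # the "latest" split mirrors the most recent run
)
```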
[ "# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mmlu-merged\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/vicuna-7b-v1.5-lora-mmlu-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:48:47.444123(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mmlu-merged\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/vicuna-7b-v1.5-lora-mmlu-merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-01T23:48:47.444123(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
e5103a91ca13429c52e4fa1be0d80d96dfa8f481
# Dataset Card for Evaluation run of Manolo26/metis-chat-instruct-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Manolo26/metis-chat-instruct-7b](https://huggingface.co/Manolo26/metis-chat-instruct-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:02:59.237994](https://huggingface.co/datasets/open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b/blob/main/results_2024-02-02T00-02-59.237994.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6547785558842805, "acc_stderr": 0.03202891771241899, "acc_norm": 0.6544439251918872, "acc_norm_stderr": 0.03269060345519551, "mc1": 0.5593635250917993, "mc1_stderr": 0.017379697555437446, "mc2": 0.6943511626172874, "mc2_stderr": 0.015021978343903631 }, "harness|arc:challenge|25": { "acc": 0.7039249146757679, "acc_stderr": 0.01334091608524626, "acc_norm": 0.7286689419795221, "acc_norm_stderr": 0.0129938077275458 }, "harness|hellaswag|10": { "acc": 0.7099183429595698, "acc_stderr": 0.0045287239518782395, "acc_norm": 0.8816968731328421, "acc_norm_stderr": 0.003223066591806001 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944427, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944427 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.023710888501970565, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.023710888501970565 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02874204090394848, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02874204090394848 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.0302839955258844, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, 
"acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608311, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608311 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508287, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43687150837988825, "acc_stderr": 0.016588680864530626, "acc_norm": 0.43687150837988825, "acc_norm_stderr": 0.016588680864530626 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.029790719243829727, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.029790719243829727 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47196870925684486, "acc_stderr": 0.012750151802922435, "acc_norm": 0.47196870925684486, "acc_norm_stderr": 0.012750151802922435 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.01897542792050721, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.01897542792050721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5593635250917993, "mc1_stderr": 0.017379697555437446, "mc2": 0.6943511626172874, "mc2_stderr": 0.015021978343903631 }, "harness|winogrande|5": { "acc": 0.8184688239936859, "acc_stderr": 0.010833276515007493 }, "harness|gsm8k|5": { "acc": 0.7073540561031084, "acc_stderr": 0.012532334368242894 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
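As a complement to the per-task loading example in this card, the aggregated scores it describes can be read from the "results" configuration. The snippet below is a minimal sketch assuming the `datasets` library is installed; the config name ("results") and the "latest" split are the ones documented in this card and in the configuration list of this record.

```python
# Sketch: load the aggregated results of the most recent run for this dataset.
# Assumes the Hugging Face `datasets` library is installed; "results" and
# "latest" are the config/split names documented in the card above.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```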
open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b
[ "region:us" ]
2024-02-02T00:05:19+00:00
{"pretty_name": "Evaluation run of Manolo26/metis-chat-instruct-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Manolo26/metis-chat-instruct-7b](https://huggingface.co/Manolo26/metis-chat-instruct-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:02:59.237994](https://huggingface.co/datasets/open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b/blob/main/results_2024-02-02T00-02-59.237994.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6547785558842805,\n \"acc_stderr\": 0.03202891771241899,\n \"acc_norm\": 0.6544439251918872,\n \"acc_norm_stderr\": 0.03269060345519551,\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6943511626172874,\n \"mc2_stderr\": 0.015021978343903631\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.01334091608524626,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.0129938077275458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7099183429595698,\n \"acc_stderr\": 0.0045287239518782395,\n \"acc_norm\": 0.8816968731328421,\n \"acc_norm_stderr\": 0.003223066591806001\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922435,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922435\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6943511626172874,\n \"mc2_stderr\": 0.015021978343903631\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007493\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \"acc_stderr\": 0.012532334368242894\n }\n}\n```", "repo_url": 
"https://huggingface.co/Manolo26/metis-chat-instruct-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-02-59.237994.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-02-59.237994.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-02-59.237994.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-02-59.237994.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-02-59.237994.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_02_59.237994", "path": ["**/details_harness|winogrande|5_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-02-59.237994.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_02_59.237994", "path": ["results_2024-02-02T00-02-59.237994.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-02-59.237994.parquet"]}]}]}
2024-02-02T00:05:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Manolo26/metis-chat-instruct-7b Dataset automatically created during the evaluation run of model Manolo26/metis-chat-instruct-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:02:59.237994 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
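The loading snippet referenced just above ("do the following:") was dropped when this card text was flattened. Based on the repository metadata earlier in this record, a minimal reconstruction would look like the sketch below; `harness_winogrande_5` is just one of the 63 available configurations and can be swapped for any other config name listed in the metadata.

```python
from datasets import load_dataset

# Load the per-sample details for one evaluation task of this run.
# Any of the 63 configuration names from the repository metadata can be
# substituted for "harness_winogrande_5".
data = load_dataset(
    "open-llm-leaderboard/details_Manolo26__metis-chat-instruct-7b",
    "harness_winogrande_5",
    split="train",
)
```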
[ "# Dataset Card for Evaluation run of Manolo26/metis-chat-instruct-7b\n\n\n\nDataset automatically created during the evaluation run of model Manolo26/metis-chat-instruct-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:02:59.237994(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Manolo26/metis-chat-instruct-7b\n\n\n\nDataset automatically created during the evaluation run of model Manolo26/metis-chat-instruct-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:02:59.237994(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3a05c2fe0e877e01d5ada208671fcb10a9c03124
# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6step_4000 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ewqr2130/llama_ppo_1e6step_4000](https://huggingface.co/ewqr2130/llama_ppo_1e6step_4000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:09:18.909168](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000/blob/main/results_2024-02-02T00-09-18.909168.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4691523181335285, "acc_stderr": 0.034463131214080095, "acc_norm": 0.4740999577703851, "acc_norm_stderr": 0.03524404756511876, "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4124263540715807, "mc2_stderr": 0.014353328846214618 }, "harness|arc:challenge|25": { "acc": 0.5042662116040956, "acc_stderr": 0.014610858923956959, "acc_norm": 0.5443686006825939, "acc_norm_stderr": 0.014553749939306861 }, "harness|hellaswag|10": { "acc": 0.5871340370444135, "acc_stderr": 0.004913429010559069, "acc_norm": 0.7865962955586536, "acc_norm_stderr": 0.004088730085367326 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3881578947368421, "acc_stderr": 0.03965842097512744, "acc_norm": 0.3881578947368421, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.44528301886792454, "acc_stderr": 0.030588052974270655, "acc_norm": 0.44528301886792454, "acc_norm_stderr": 0.030588052974270655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4652777777777778, "acc_stderr": 0.04171115858181618, "acc_norm": 0.4652777777777778, "acc_norm_stderr": 0.04171115858181618 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, 
"acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4297872340425532, "acc_stderr": 0.03236214467715563, "acc_norm": 0.4297872340425532, "acc_norm_stderr": 0.03236214467715563 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708628, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708628 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574924, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574924 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4870967741935484, "acc_stderr": 0.028434533152681848, "acc_norm": 0.4870967741935484, "acc_norm_stderr": 0.028434533152681848 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3694581280788177, "acc_stderr": 0.03395970381998573, "acc_norm": 0.3694581280788177, "acc_norm_stderr": 0.03395970381998573 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6121212121212121, "acc_stderr": 0.038049136539710114, "acc_norm": 0.6121212121212121, "acc_norm_stderr": 0.038049136539710114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5, "acc_stderr": 0.035623524993954825, "acc_norm": 0.5, "acc_norm_stderr": 0.035623524993954825 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6580310880829016, "acc_stderr": 0.03423465100104282, "acc_norm": 0.6580310880829016, "acc_norm_stderr": 0.03423465100104282 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4358974358974359, "acc_stderr": 0.02514180151117749, "acc_norm": 0.4358974358974359, "acc_norm_stderr": 0.02514180151117749 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.0275285992103405, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.0275285992103405 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.39915966386554624, "acc_stderr": 0.031811100324139245, "acc_norm": 0.39915966386554624, "acc_norm_stderr": 0.031811100324139245 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, 
"acc_stderr": 0.03757949922943342, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943342 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6238532110091743, "acc_stderr": 0.02076923196820508, "acc_norm": 0.6238532110091743, "acc_norm_stderr": 0.02076923196820508 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.25462962962962965, "acc_stderr": 0.029711275860005357, "acc_norm": 0.25462962962962965, "acc_norm_stderr": 0.029711275860005357 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5882352941176471, "acc_stderr": 0.03454236585380609, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.03454236585380609 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6497890295358649, "acc_stderr": 0.031052391937584346, "acc_norm": 0.6497890295358649, "acc_norm_stderr": 0.031052391937584346 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5650224215246636, "acc_stderr": 0.033272833702713445, "acc_norm": 0.5650224215246636, "acc_norm_stderr": 0.033272833702713445 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5114503816793893, "acc_stderr": 0.04384140024078016, "acc_norm": 0.5114503816793893, "acc_norm_stderr": 0.04384140024078016 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6033057851239669, "acc_stderr": 0.044658697805310094, "acc_norm": 0.6033057851239669, "acc_norm_stderr": 0.044658697805310094 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5277777777777778, "acc_stderr": 0.048262172941398944, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.048262172941398944 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5153374233128835, "acc_stderr": 0.03926522378708843, "acc_norm": 0.5153374233128835, "acc_norm_stderr": 0.03926522378708843 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.5339805825242718, "acc_stderr": 0.0493929144727348, "acc_norm": 0.5339805825242718, "acc_norm_stderr": 0.0493929144727348 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.028760348956523414, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.028760348956523414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.648786717752235, "acc_stderr": 0.01706998205149943, "acc_norm": 0.648786717752235, "acc_norm_stderr": 0.01706998205149943 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5173410404624278, "acc_stderr": 0.02690290045866664, "acc_norm": 0.5173410404624278, "acc_norm_stderr": 0.02690290045866664 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25921787709497207, "acc_stderr": 0.014655780837497738, "acc_norm": 0.25921787709497207, "acc_norm_stderr": 0.014655780837497738 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4934640522875817, "acc_stderr": 0.028627470550556047, "acc_norm": 0.4934640522875817, "acc_norm_stderr": 0.028627470550556047 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6205787781350482, "acc_stderr": 0.02755994980234782, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.02755994980234782 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5277777777777778, "acc_stderr": 0.027777777777777797, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.027777777777777797 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.3475177304964539, "acc_stderr": 0.028406627809590954, "acc_norm": 0.3475177304964539, "acc_norm_stderr": 0.028406627809590954 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3474576271186441, "acc_stderr": 0.0121614177297498, "acc_norm": 0.3474576271186441, "acc_norm_stderr": 0.0121614177297498 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.46691176470588236, "acc_stderr": 0.030306257722468307, "acc_norm": 0.46691176470588236, "acc_norm_stderr": 0.030306257722468307 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4673202614379085, "acc_stderr": 0.020184583359102202, "acc_norm": 0.4673202614379085, "acc_norm_stderr": 0.020184583359102202 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 0.04738198703545483, "acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.45714285714285713, "acc_stderr": 0.03189141832421397, "acc_norm": 0.45714285714285713, "acc_norm_stderr": 0.03189141832421397 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6268656716417911, "acc_stderr": 0.03419832608176008, "acc_norm": 0.6268656716417911, "acc_norm_stderr": 0.03419832608176008 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03565079670708311, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03565079670708311 }, "harness|truthfulqa:mc|0": { "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4124263540715807, "mc2_stderr": 0.014353328846214618 }, "harness|winogrande|5": { "acc": 0.7419100236779794, "acc_stderr": 0.012298278833972394 }, "harness|gsm8k|5": { "acc": 0.14404852160727824, "acc_stderr": 0.009672110973065286 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000
[ "region:us" ]
2024-02-02T00:11:45+00:00
{"pretty_name": "Evaluation run of ewqr2130/llama_ppo_1e6step_4000", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/llama_ppo_1e6step_4000](https://huggingface.co/ewqr2130/llama_ppo_1e6step_4000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:09:18.909168](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000/blob/main/results_2024-02-02T00-09-18.909168.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4691523181335285,\n \"acc_stderr\": 0.034463131214080095,\n \"acc_norm\": 0.4740999577703851,\n \"acc_norm_stderr\": 0.03524404756511876,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4124263540715807,\n \"mc2_stderr\": 0.014353328846214618\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956959,\n \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.014553749939306861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5871340370444135,\n \"acc_stderr\": 0.004913429010559069,\n \"acc_norm\": 0.7865962955586536,\n \"acc_norm_stderr\": 0.004088730085367326\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708628,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708628\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681848,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104282,\n \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104282\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.0275285992103405,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.0275285992103405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6238532110091743,\n \"acc_stderr\": 0.02076923196820508,\n \"acc_norm\": 0.6238532110091743,\n \"acc_norm_stderr\": 0.02076923196820508\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005357,\n \"acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005357\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380609,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380609\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.648786717752235,\n \"acc_stderr\": 0.01706998205149943,\n \"acc_norm\": 0.648786717752235,\n \"acc_norm_stderr\": 0.01706998205149943\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n \"acc_stderr\": 0.014655780837497738,\n \"acc_norm\": 0.25921787709497207,\n \"acc_norm_stderr\": 0.014655780837497738\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n \"acc_stderr\": 0.0121614177297498,\n \"acc_norm\": 0.3474576271186441,\n \"acc_norm_stderr\": 0.0121614177297498\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468307,\n \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468307\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.020184583359102202,\n \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.020184583359102202\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n \"acc_stderr\": 0.03419832608176008,\n \"acc_norm\": 0.6268656716417911,\n \"acc_norm_stderr\": 0.03419832608176008\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4124263540715807,\n \"mc2_stderr\": 0.014353328846214618\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972394\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14404852160727824,\n \"acc_stderr\": 0.009672110973065286\n 
}\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/llama_ppo_1e6step_4000", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-09-18.909168.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-09-18.909168.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-09-18.909168.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-09-18.909168.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-09-18.909168.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_09_18.909168", "path": ["**/details_harness|winogrande|5_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-09-18.909168.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_09_18.909168", "path": ["results_2024-02-02T00-09-18.909168.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-09-18.909168.parquet"]}]}]}
2024-02-02T00:12:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6step_4000 Dataset automatically created during the evaluation run of model ewqr2130/llama_ppo_1e6step_4000 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:09:18.909168 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
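The processed rendering above strips the fenced snippet that follows "you can for instance do the following:" in the original card. For reference, a minimal sketch of that load call, taken from this record's metadata, is shown below ("harness_winogrande_5" is simply the example config used by the card template; any of the 63 configs can be passed):

```python
from datasets import load_dataset

# Load one evaluation config from this run; the "train" split always
# points to the latest results for that config.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__llama_ppo_1e6step_4000",
    "harness_winogrande_5",
    split="train",
)
```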
[ "# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6step_4000\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/llama_ppo_1e6step_4000 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:09:18.909168(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ewqr2130/llama_ppo_1e6step_4000\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/llama_ppo_1e6step_4000 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:09:18.909168(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
97ed5b076f87dc6fbc75d95d52ee969d55d51ac3
# Dataset Card for Evaluation run of Zangs3011/mixtral_8x7b_MonsterInstruct <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Zangs3011/mixtral_8x7b_MonsterInstruct](https://huggingface.co/Zangs3011/mixtral_8x7b_MonsterInstruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:13:11.884306](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct/blob/main/results_2024-02-02T00-13-11.884306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6972191733645681, "acc_stderr": 0.0306137906197003, "acc_norm": 0.7032908621207318, "acc_norm_stderr": 0.03120147743682294, "mc1": 0.3378212974296206, "mc1_stderr": 0.016557167322516886, "mc2": 0.4847358578212202, "mc2_stderr": 0.014200020930273186 }, "harness|arc:challenge|25": { "acc": 0.6092150170648464, "acc_stderr": 0.014258563880513782, "acc_norm": 0.6518771331058021, "acc_norm_stderr": 0.013921008595179344 }, "harness|hellaswag|10": { "acc": 0.6528579964150567, "acc_stderr": 0.004750884401095162, "acc_norm": 0.8580959968133838, "acc_norm_stderr": 0.003482384956632783 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7111111111111111, "acc_stderr": 0.03915450630414251, "acc_norm": 0.7111111111111111, "acc_norm_stderr": 0.03915450630414251 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8157894736842105, "acc_stderr": 0.0315469804508223, "acc_norm": 0.8157894736842105, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7660377358490567, "acc_stderr": 0.02605529690115292, "acc_norm": 0.7660377358490567, "acc_norm_stderr": 0.02605529690115292 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8125, "acc_stderr": 0.032639560491693344, "acc_norm": 0.8125, "acc_norm_stderr": 0.032639560491693344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.63, "acc_stderr": 0.048523658709391, "acc_norm": 0.63, 
"acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.0349610148119118, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.0349610148119118 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6893617021276596, "acc_stderr": 0.03025123757921317, "acc_norm": 0.6893617021276596, "acc_norm_stderr": 0.03025123757921317 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5789473684210527, "acc_stderr": 0.04644602091222317, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.04644602091222317 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6482758620689655, "acc_stderr": 0.03979236637497411, "acc_norm": 0.6482758620689655, "acc_norm_stderr": 0.03979236637497411 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.455026455026455, "acc_stderr": 0.025646928361049398, "acc_norm": 0.455026455026455, "acc_norm_stderr": 0.025646928361049398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5238095238095238, "acc_stderr": 0.04467062628403273, "acc_norm": 0.5238095238095238, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8225806451612904, "acc_stderr": 0.02173254068932928, "acc_norm": 0.8225806451612904, "acc_norm_stderr": 0.02173254068932928 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6403940886699507, "acc_stderr": 0.03376458246509567, "acc_norm": 0.6403940886699507, "acc_norm_stderr": 0.03376458246509567 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.806060606060606, "acc_stderr": 0.030874145136562076, "acc_norm": 0.806060606060606, "acc_norm_stderr": 0.030874145136562076 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8232323232323232, "acc_stderr": 0.027178752639044915, "acc_norm": 0.8232323232323232, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.019321805557223164, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.019321805557223164 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.023710888501970565, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.023710888501970565 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3888888888888889, "acc_stderr": 0.029723278961476668, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.029723278961476668 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02865749128507198, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02865749128507198 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.5165562913907285, "acc_stderr": 0.04080244185628972, "acc_norm": 0.5165562913907285, "acc_norm_stderr": 0.04080244185628972 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8660550458715597, "acc_stderr": 0.014602811435592635, "acc_norm": 0.8660550458715597, "acc_norm_stderr": 0.014602811435592635 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5972222222222222, "acc_stderr": 0.03344887382997865, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.03344887382997865 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7488789237668162, "acc_stderr": 0.029105220833224615, "acc_norm": 0.7488789237668162, "acc_norm_stderr": 0.029105220833224615 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494732, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494732 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.030922788320445805, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.030922788320445805 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.037601780060266224, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.01789378490401852, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.01789378490401852 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909282, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8633461047254151, "acc_stderr": 0.012282876868629233, "acc_norm": 0.8633461047254151, "acc_norm_stderr": 0.012282876868629233 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7687861271676301, "acc_stderr": 0.02269865716785571, "acc_norm": 0.7687861271676301, "acc_norm_stderr": 0.02269865716785571 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4022346368715084, "acc_stderr": 0.01639971673284714, "acc_norm": 0.4022346368715084, "acc_norm_stderr": 0.01639971673284714 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7845659163987139, "acc_stderr": 0.023350225475471442, "acc_norm": 0.7845659163987139, "acc_norm_stderr": 0.023350225475471442 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8425925925925926, "acc_stderr": 0.02026376499638572, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 0.02026376499638572 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5390070921985816, "acc_stderr": 0.02973659252642444, "acc_norm": 0.5390070921985816, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5221642764015645, "acc_stderr": 0.01275768304771618, "acc_norm": 0.5221642764015645, "acc_norm_stderr": 0.01275768304771618 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7757352941176471, "acc_stderr": 0.025336848563332372, "acc_norm": 0.7757352941176471, "acc_norm_stderr": 0.025336848563332372 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7598039215686274, "acc_stderr": 0.01728276069516741, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.01728276069516741 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7959183673469388, "acc_stderr": 0.0258012834750905, "acc_norm": 0.7959183673469388, "acc_norm_stderr": 0.0258012834750905 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8706467661691543, "acc_stderr": 0.023729830881018526, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018526 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.0389136449583582, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.0389136449583582 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8596491228070176, "acc_stderr": 0.0266405825391332, "acc_norm": 0.8596491228070176, "acc_norm_stderr": 0.0266405825391332 }, "harness|truthfulqa:mc|0": { "mc1": 0.3378212974296206, "mc1_stderr": 0.016557167322516886, "mc2": 0.4847358578212202, "mc2_stderr": 0.014200020930273186 }, "harness|winogrande|5": { "acc": 0.8026835043409629, "acc_stderr": 0.011185026389050376 }, "harness|gsm8k|5": { "acc": 0.48142532221379836, "acc_stderr": 0.013762977910317584 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
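As a supplement to the loading snippet above, here is a minimal sketch of pulling the aggregated metrics from the "results" configuration described in this card (assuming the standard `datasets` API; the repository, configuration, and split names are taken from this card's configuration list):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation
# (here 2024-02-02T00:13:11.884306).
results = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct",
    "results",
    split="latest",
)

# Convert to a pandas DataFrame to inspect the stored columns and metrics.
df = results.to_pandas()
print(df.columns.tolist())
```

The per-task details follow the same pattern: swap "results" for one of the task configurations listed above (for example "harness_gsm8k_5") to get the instance-level records for that task.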
open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct
[ "region:us" ]
2024-02-02T00:15:28+00:00
{"pretty_name": "Evaluation run of Zangs3011/mixtral_8x7b_MonsterInstruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/mixtral_8x7b_MonsterInstruct](https://huggingface.co/Zangs3011/mixtral_8x7b_MonsterInstruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:13:11.884306](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct/blob/main/results_2024-02-02T00-13-11.884306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6972191733645681,\n \"acc_stderr\": 0.0306137906197003,\n \"acc_norm\": 0.7032908621207318,\n \"acc_norm_stderr\": 0.03120147743682294,\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516886,\n \"mc2\": 0.4847358578212202,\n \"mc2_stderr\": 0.014200020930273186\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513782,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n \"acc_stderr\": 0.004750884401095162,\n \"acc_norm\": 0.8580959968133838,\n \"acc_norm_stderr\": 0.003482384956632783\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7660377358490567,\n \"acc_stderr\": 0.02605529690115292,\n \"acc_norm\": 0.7660377358490567,\n \"acc_norm_stderr\": 0.02605529690115292\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04644602091222317,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04644602091222317\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.03979236637497411,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.03979236637497411\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932928,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562076,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562076\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223164,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223164\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476668,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02865749128507198,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02865749128507198\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5165562913907285,\n \"acc_stderr\": 0.04080244185628972,\n \"acc_norm\": 0.5165562913907285,\n \"acc_norm_stderr\": 0.04080244185628972\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8660550458715597,\n \"acc_stderr\": 0.014602811435592635,\n \"acc_norm\": 0.8660550458715597,\n \"acc_norm_stderr\": 0.014602811435592635\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7488789237668162,\n \"acc_stderr\": 0.029105220833224615,\n \"acc_norm\": 0.7488789237668162,\n \"acc_norm_stderr\": 0.029105220833224615\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445805,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445805\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401852,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401852\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n \"acc_stderr\": 
0.012282876868629233,\n \"acc_norm\": 0.8633461047254151,\n \"acc_norm_stderr\": 0.012282876868629233\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n \"acc_stderr\": 0.01639971673284714,\n \"acc_norm\": 0.4022346368715084,\n \"acc_norm_stderr\": 0.01639971673284714\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.02026376499638572,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.02026376499638572\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5221642764015645,\n \"acc_stderr\": 0.01275768304771618,\n \"acc_norm\": 0.5221642764015645,\n \"acc_norm_stderr\": 0.01275768304771618\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332372,\n \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332372\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.01728276069516741,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.01728276069516741\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.0258012834750905,\n \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.0258012834750905\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516886,\n \"mc2\": 0.4847358578212202,\n \"mc2_stderr\": 0.014200020930273186\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \"acc_stderr\": 0.013762977910317584\n }\n}\n```", "repo_url": 
"https://huggingface.co/Zangs3011/mixtral_8x7b_MonsterInstruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-13-11.884306.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-13-11.884306.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-13-11.884306.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-13-11.884306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-13-11.884306.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_13_11.884306", "path": ["**/details_harness|winogrande|5_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-13-11.884306.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_13_11.884306", "path": ["results_2024-02-02T00-13-11.884306.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-13-11.884306.parquet"]}]}]}
2024-02-02T00:15:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Zangs3011/mixtral_8x7b_MonsterInstruct Dataset automatically created during the evaluation run of model Zangs3011/mixtral_8x7b_MonsterInstruct on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:13:11.884306 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
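A minimal loading sketch for this run is shown below; the repository name is an assumption inferred from the usual `open-llm-leaderboard/details_<org>__<model>` naming convention rather than taken from this card:

```python
from datasets import load_dataset

# Assumed repository name, following the standard details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mixtral_8x7b_MonsterInstruct",
    "harness_winogrande_5",
    split="train",
)
```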
[ "# Dataset Card for Evaluation run of Zangs3011/mixtral_8x7b_MonsterInstruct\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mixtral_8x7b_MonsterInstruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:13:11.884306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Zangs3011/mixtral_8x7b_MonsterInstruct\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mixtral_8x7b_MonsterInstruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:13:11.884306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fda5b44fafbfdb7650c3e839e73f1fe21fd25389
# Dataset Card for Evaluation run of ewqr2130/llama_sft_longer <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ewqr2130/llama_sft_longer](https://huggingface.co/ewqr2130/llama_sft_longer) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ewqr2130__llama_sft_longer", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:16:03.547688](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_sft_longer/blob/main/results_2024-02-02T00-16-03.547688.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4702872401479079, "acc_stderr": 0.03444115725941989, "acc_norm": 0.4753909268597203, "acc_norm_stderr": 0.03521794534722382, "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.40815192782555304, "mc2_stderr": 0.014287904281831857 }, "harness|arc:challenge|25": { "acc": 0.49829351535836175, "acc_stderr": 0.014611305705056995, "acc_norm": 0.5477815699658704, "acc_norm_stderr": 0.014544519880633829 }, "harness|hellaswag|10": { "acc": 0.5866361282613025, "acc_stderr": 0.0049143057985756924, "acc_norm": 0.7857996415056762, "acc_norm_stderr": 0.00409427987173368 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3815789473684211, "acc_stderr": 0.03953173377749194, "acc_norm": 0.3815789473684211, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.44528301886792454, "acc_stderr": 0.030588052974270655, "acc_norm": 0.44528301886792454, "acc_norm_stderr": 0.030588052974270655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4652777777777778, "acc_stderr": 0.04171115858181618, "acc_norm": 0.4652777777777778, "acc_norm_stderr": 0.04171115858181618 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617746, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617746 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224468, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.04404556157374768, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.04404556157374768 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5, "acc_stderr": 0.028444006199428714, "acc_norm": 0.5, "acc_norm_stderr": 0.028444006199428714 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561952, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561952 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6121212121212121, "acc_stderr": 0.038049136539710114, "acc_norm": 0.6121212121212121, "acc_norm_stderr": 0.038049136539710114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.51010101010101, "acc_stderr": 0.035616254886737454, "acc_norm": 0.51010101010101, "acc_norm_stderr": 0.035616254886737454 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6632124352331606, "acc_stderr": 0.03410780251836184, "acc_norm": 0.6632124352331606, "acc_norm_stderr": 0.03410780251836184 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.441025641025641, "acc_stderr": 0.025174048384000752, "acc_norm": 0.441025641025641, "acc_norm_stderr": 0.025174048384000752 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.40336134453781514, "acc_stderr": 0.031866081214088314, "acc_norm": 0.40336134453781514, "acc_norm_stderr": 0.031866081214088314 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, "acc_stderr": 0.03710185726119995, "acc_norm": 
0.2913907284768212, "acc_norm_stderr": 0.03710185726119995 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6238532110091743, "acc_stderr": 0.02076923196820508, "acc_norm": 0.6238532110091743, "acc_norm_stderr": 0.02076923196820508 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.25462962962962965, "acc_stderr": 0.029711275860005357, "acc_norm": 0.25462962962962965, "acc_norm_stderr": 0.029711275860005357 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5882352941176471, "acc_stderr": 0.03454236585380609, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.03454236585380609 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6413502109704642, "acc_stderr": 0.03121956944530185, "acc_norm": 0.6413502109704642, "acc_norm_stderr": 0.03121956944530185 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5267175572519084, "acc_stderr": 0.04379024936553894, "acc_norm": 0.5267175572519084, "acc_norm_stderr": 0.04379024936553894 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5277777777777778, "acc_stderr": 0.048262172941398944, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.048262172941398944 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5214723926380368, "acc_stderr": 0.03924746876751129, "acc_norm": 0.5214723926380368, "acc_norm_stderr": 0.03924746876751129 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.5339805825242718, "acc_stderr": 0.0493929144727348, "acc_norm": 0.5339805825242718, "acc_norm_stderr": 0.0493929144727348 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7435897435897436, "acc_stderr": 0.028605953702004236, "acc_norm": 0.7435897435897436, "acc_norm_stderr": 0.028605953702004236 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6500638569604087, "acc_stderr": 0.017055679797150426, "acc_norm": 0.6500638569604087, "acc_norm_stderr": 0.017055679797150426 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5173410404624278, "acc_stderr": 0.02690290045866664, "acc_norm": 0.5173410404624278, "acc_norm_stderr": 0.02690290045866664 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2636871508379888, "acc_stderr": 0.01473692638376198, "acc_norm": 0.2636871508379888, "acc_norm_stderr": 0.01473692638376198 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.49019607843137253, "acc_stderr": 0.028624412550167958, "acc_norm": 0.49019607843137253, "acc_norm_stderr": 0.028624412550167958 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6109324758842444, "acc_stderr": 0.027690337536485372, "acc_norm": 0.6109324758842444, "acc_norm_stderr": 0.027690337536485372 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5277777777777778, "acc_stderr": 0.027777777777777797, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.027777777777777797 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.3546099290780142, "acc_stderr": 0.028538650028878638, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.028538650028878638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.35267275097783574, "acc_stderr": 0.012203286846053887, "acc_norm": 0.35267275097783574, "acc_norm_stderr": 0.012203286846053887 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.48161764705882354, "acc_stderr": 0.03035230339535196, "acc_norm": 0.48161764705882354, "acc_norm_stderr": 0.03035230339535196 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4624183006535948, "acc_stderr": 0.02017061497496977, "acc_norm": 0.4624183006535948, "acc_norm_stderr": 0.02017061497496977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5636363636363636, "acc_stderr": 0.04750185058907297, "acc_norm": 0.5636363636363636, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.45714285714285713, "acc_stderr": 0.031891418324213966, "acc_norm": 0.45714285714285713, "acc_norm_stderr": 0.031891418324213966 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6467661691542289, "acc_stderr": 0.03379790611796777, "acc_norm": 0.6467661691542289, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.672514619883041, "acc_stderr": 0.035993357714560276, "acc_norm": 0.672514619883041, "acc_norm_stderr": 0.035993357714560276 }, "harness|truthfulqa:mc|0": { "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.40815192782555304, "mc2_stderr": 0.014287904281831857 }, "harness|winogrande|5": { "acc": 0.7387529597474349, "acc_stderr": 0.012346914863415305 }, "harness|gsm8k|5": { "acc": 0.14935557240333586, "acc_stderr": 0.009818090723727286 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
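The loading example above extends to any of the per-task configurations listed in this dataset's metadata. A minimal sketch, assuming the standard leaderboard config layout (one timestamped split plus a "latest" split per configuration, and an aggregated "results" configuration):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_ewqr2130__llama_sft_longer"

# Per-task details: each harness_* configuration exposes its timestamped split and "latest".
abstract_algebra = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```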
open-llm-leaderboard/details_ewqr2130__llama_sft_longer
[ "region:us" ]
2024-02-02T00:18:26+00:00
{"pretty_name": "Evaluation run of ewqr2130/llama_sft_longer", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/llama_sft_longer](https://huggingface.co/ewqr2130/llama_sft_longer) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__llama_sft_longer\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:16:03.547688](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama_sft_longer/blob/main/results_2024-02-02T00-16-03.547688.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4702872401479079,\n \"acc_stderr\": 0.03444115725941989,\n \"acc_norm\": 0.4753909268597203,\n \"acc_norm_stderr\": 0.03521794534722382,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.40815192782555304,\n \"mc2_stderr\": 0.014287904281831857\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.014611305705056995,\n \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633829\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5866361282613025,\n \"acc_stderr\": 0.0049143057985756924,\n \"acc_norm\": 0.7857996415056762,\n \"acc_norm_stderr\": 0.00409427987173368\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 
0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.441025641025641,\n \"acc_stderr\": 0.025174048384000752,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000752\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6238532110091743,\n \"acc_stderr\": 0.02076923196820508,\n \"acc_norm\": 0.6238532110091743,\n \"acc_norm_stderr\": 0.02076923196820508\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005357,\n \"acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005357\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380609,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380609\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6413502109704642,\n \"acc_stderr\": 0.03121956944530185,\n \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.03121956944530185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n \"acc_stderr\": 0.028605953702004236,\n \"acc_norm\": 0.7435897435897436,\n \"acc_norm_stderr\": 0.028605953702004236\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6500638569604087,\n \"acc_stderr\": 0.017055679797150426,\n 
\"acc_norm\": 0.6500638569604087,\n \"acc_norm_stderr\": 0.017055679797150426\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.01473692638376198,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.01473692638376198\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.028624412550167958,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.028624412550167958\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35267275097783574,\n \"acc_stderr\": 0.012203286846053887,\n \"acc_norm\": 0.35267275097783574,\n \"acc_norm_stderr\": 0.012203286846053887\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.03035230339535196,\n \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.03035230339535196\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4624183006535948,\n \"acc_stderr\": 0.02017061497496977,\n \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.02017061497496977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.031891418324213966,\n \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.031891418324213966\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.40815192782555304,\n \"mc2_stderr\": 0.014287904281831857\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.012346914863415305\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \"acc_stderr\": 0.009818090723727286\n }\n}\n```", "repo_url": 
"https://huggingface.co/ewqr2130/llama_sft_longer", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-03.547688.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-03.547688.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-03.547688.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-03.547688.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-03.547688.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_16_03.547688", "path": ["**/details_harness|winogrande|5_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-16-03.547688.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_16_03.547688", "path": ["results_2024-02-02T00-16-03.547688.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-16-03.547688.parquet"]}]}]}
2024-02-02T00:19:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ewqr2130/llama_sft_longer Dataset automatically created during the evaluation run of model ewqr2130/llama_sft_longer on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:16:03.547688 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
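The flattened card above says "To load the details from a run, you can for instance do the following:" but the snippet itself was dropped when the text was flattened. A minimal sketch of that call, assuming the repo id follows the `details_<org>__<model>` naming pattern used by the neighbouring cards in this dump (the exact repo id is not stated in this flattened text):

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the details_<org>__<model> pattern
# used by the other evaluation-detail datasets in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__llama_sft_longer",
    "harness_winogrande_5",
    split="train",
)
```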
[ "# Dataset Card for Evaluation run of ewqr2130/llama_sft_longer\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/llama_sft_longer on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:16:03.547688(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ewqr2130/llama_sft_longer\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/llama_sft_longer on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:16:03.547688(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f9fc59fdb877575c4df13ea5d71e6ab15e3874ed
# Dataset Card for Evaluation run of Charlie911/MultiLoRA-mmlu <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Charlie911/MultiLoRA-mmlu](https://huggingface.co/Charlie911/MultiLoRA-mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:16:37.919745](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu/blob/main/results_2024-02-02T00-16-37.919745.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.49682393519091933, "acc_stderr": 0.034234549326781244, "acc_norm": 0.5024513039750834, "acc_norm_stderr": 0.03499839555197597, "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113503, "mc2": 0.5020042655317223, "mc2_stderr": 0.015409102519026984 }, "harness|arc:challenge|25": { "acc": 0.49658703071672355, "acc_stderr": 0.014611050403244077, "acc_norm": 0.5238907849829352, "acc_norm_stderr": 0.014594701798071654 }, "harness|hellaswag|10": { "acc": 0.5803624775941048, "acc_stderr": 0.004924910433106353, "acc_norm": 0.7720573590918144, "acc_norm_stderr": 0.0041864806453155625 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45185185185185184, "acc_stderr": 0.04299268905480864, "acc_norm": 0.45185185185185184, "acc_norm_stderr": 0.04299268905480864 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.506578947368421, "acc_stderr": 0.040685900502249704, "acc_norm": 0.506578947368421, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5169811320754717, "acc_stderr": 0.030755120364119905, "acc_norm": 0.5169811320754717, "acc_norm_stderr": 0.030755120364119905 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5, "acc_stderr": 0.04181210050035455, "acc_norm": 0.5, "acc_norm_stderr": 0.04181210050035455 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03708284662416542, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03708284662416542 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.43829787234042555, "acc_stderr": 0.032436186361081004, "acc_norm": 0.43829787234042555, "acc_norm_stderr": 0.032436186361081004 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159393, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159393 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2857142857142857, "acc_stderr": 0.023266512213730578, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.023266512213730578 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.04190596438871136, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871136 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.535483870967742, "acc_stderr": 0.02837228779796293, "acc_norm": 0.535483870967742, "acc_norm_stderr": 0.02837228779796293 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.03445487686264715, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.03445487686264715 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512567, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512567 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5909090909090909, "acc_stderr": 0.03502975799413007, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.03502975799413007 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7046632124352331, "acc_stderr": 0.03292296639155141, "acc_norm": 0.7046632124352331, "acc_norm_stderr": 0.03292296639155141 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4717948717948718, "acc_stderr": 0.025310639254933893, "acc_norm": 0.4717948717948718, "acc_norm_stderr": 0.025310639254933893 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275805, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275805 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4369747899159664, "acc_stderr": 0.03221943636566196, "acc_norm": 0.4369747899159664, "acc_norm_stderr": 0.03221943636566196 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 
0.03631329803969653, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.03631329803969653 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6770642201834862, "acc_stderr": 0.02004811592341531, "acc_norm": 0.6770642201834862, "acc_norm_stderr": 0.02004811592341531 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.032282103870378935, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.032282103870378935 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.03016513786784701, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.03016513786784701 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5560538116591929, "acc_stderr": 0.03334625674242728, "acc_norm": 0.5560538116591929, "acc_norm_stderr": 0.03334625674242728 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5877862595419847, "acc_stderr": 0.04317171194870254, "acc_norm": 0.5877862595419847, "acc_norm_stderr": 0.04317171194870254 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.0453793517794788, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.0453793517794788 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5648148148148148, "acc_stderr": 0.04792898170907061, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.04792898170907061 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5337423312883436, "acc_stderr": 0.039194155450484096, "acc_norm": 0.5337423312883436, "acc_norm_stderr": 0.039194155450484096 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.0465614711001235, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.0465614711001235 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7692307692307693, "acc_stderr": 0.0276019213814176, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.0276019213814176 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.669220945083014, "acc_stderr": 0.01682481846256376, "acc_norm": 0.669220945083014, "acc_norm_stderr": 0.01682481846256376 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.546242774566474, "acc_stderr": 0.02680372058320617, "acc_norm": 0.546242774566474, "acc_norm_stderr": 0.02680372058320617 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24022346368715083, "acc_stderr": 0.014288343803925293, "acc_norm": 0.24022346368715083, "acc_norm_stderr": 0.014288343803925293 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5359477124183006, "acc_stderr": 0.028555827516528784, "acc_norm": 0.5359477124183006, "acc_norm_stderr": 0.028555827516528784 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5787781350482315, "acc_stderr": 0.02804339985821063, "acc_norm": 0.5787781350482315, "acc_norm_stderr": 0.02804339985821063 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.027628737155668767, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.027628737155668767 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.36524822695035464, "acc_stderr": 0.02872386385328128, "acc_norm": 0.36524822695035464, "acc_norm_stderr": 0.02872386385328128 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.363754889178618, "acc_stderr": 0.012286991879902887, "acc_norm": 0.363754889178618, "acc_norm_stderr": 0.012286991879902887 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5404411764705882, "acc_stderr": 0.03027332507734575, "acc_norm": 0.5404411764705882, "acc_norm_stderr": 0.03027332507734575 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4869281045751634, "acc_stderr": 0.020220920829626916, "acc_norm": 0.4869281045751634, "acc_norm_stderr": 0.020220920829626916 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 0.04738198703545484, "acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545484 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6408163265306123, "acc_stderr": 0.030713560455108493, "acc_norm": 0.6408163265306123, "acc_norm_stderr": 0.030713560455108493 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6865671641791045, "acc_stderr": 0.03280188205348644, "acc_norm": 0.6865671641791045, "acc_norm_stderr": 0.03280188205348644 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.03446296217088427, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.03446296217088427 }, "harness|truthfulqa:mc|0": { "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113503, "mc2": 0.5020042655317223, "mc2_stderr": 0.015409102519026984 }, "harness|winogrande|5": { "acc": 0.7221783741120757, "acc_stderr": 0.012588918183871596 }, "harness|gsm8k|5": { "acc": 0.15845337376800606, "acc_stderr": 0.010058474790238962 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
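Beyond the per-task configs, the card above describes an aggregated "results" configuration. A minimal sketch of reading those aggregated metrics, assuming the "results" config and "latest" split follow the same layout as the other evaluation runs recorded in this dump:

```python
from datasets import load_dataset

# The "results" config is named in the card above; the "latest" split name
# is assumed from the config layout used by the other runs in this dump.
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / stderr fields per task
```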
open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu
[ "region:us" ]
2024-02-02T00:19:02+00:00
{"pretty_name": "Evaluation run of Charlie911/MultiLoRA-mmlu", "dataset_summary": "Dataset automatically created during the evaluation run of model [Charlie911/MultiLoRA-mmlu](https://huggingface.co/Charlie911/MultiLoRA-mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:16:37.919745](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu/blob/main/results_2024-02-02T00-16-37.919745.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49682393519091933,\n \"acc_stderr\": 0.034234549326781244,\n \"acc_norm\": 0.5024513039750834,\n \"acc_norm_stderr\": 0.03499839555197597,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.5020042655317223,\n \"mc2_stderr\": 0.015409102519026984\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244077,\n \"acc_norm\": 0.5238907849829352,\n \"acc_norm_stderr\": 0.014594701798071654\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5803624775941048,\n \"acc_stderr\": 0.004924910433106353,\n \"acc_norm\": 0.7720573590918144,\n \"acc_norm_stderr\": 0.0041864806453155625\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n 
\"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.032436186361081004,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.032436186361081004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730578,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730578\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n \"acc_stderr\": 0.02837228779796293,\n \"acc_norm\": 0.535483870967742,\n \"acc_norm_stderr\": 0.02837228779796293\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4717948717948718,\n \"acc_stderr\": 0.025310639254933893,\n \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933893\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341531,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341531\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.032282103870378935,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378935\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.669220945083014,\n \"acc_stderr\": 0.01682481846256376,\n \"acc_norm\": 0.669220945083014,\n 
\"acc_norm_stderr\": 0.01682481846256376\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.02680372058320617,\n \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.02680372058320617\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528784,\n \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528784\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668767,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668767\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n \"acc_stderr\": 0.012286991879902887,\n \"acc_norm\": 0.363754889178618,\n \"acc_norm_stderr\": 0.012286991879902887\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626916,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626916\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545484,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545484\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n \"acc_stderr\": 0.03280188205348644,\n \"acc_norm\": 0.6865671641791045,\n \"acc_norm_stderr\": 0.03280188205348644\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.5020042655317223,\n \"mc2_stderr\": 0.015409102519026984\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871596\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \"acc_stderr\": 0.010058474790238962\n }\n}\n```", "repo_url": "https://huggingface.co/Charlie911/MultiLoRA-mmlu", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-16-37.919745.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["**/details_harness|winogrande|5_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-16-37.919745.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T00_16_37.919745", "path": ["results_2024-02-02T00-16-37.919745.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T00-16-37.919745.parquet"]}]}]}
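The "configs" entries above pair every evaluation task with two splits: the timestamped run ("2024_02_02T00_16_37.919745") and "latest". A minimal sketch of how those names map onto `load_dataset` calls, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the exact repo id is not shown in this excerpt and is an assumption here):

```python
from datasets import get_dataset_config_names, load_dataset

# Assumed repo id, inferred from the "details_<org>__<model>" convention.
repo = "open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu"

# Every "config_name" in the metadata above is exposed as a dataset config.
print(get_dataset_config_names(repo))

# Load one per-subject config; the split name matches the metadata exactly.
run = load_dataset(
    repo,
    "harness_hendrycksTest_world_religions_5",
    split="2024_02_02T00_16_37.919745",
)
```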
2024-02-02T00:19:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Charlie911/MultiLoRA-mmlu Dataset automatically created during the evaluation run of model Charlie911/MultiLoRA-mmlu on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:16:37.919745 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
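A minimal sketch of the loading snippet the card refers to, assuming the details repository for this card follows the same `open-llm-leaderboard/details_<org>__<model>` naming convention as the other evaluation runs (the repo id below is an assumption based on that convention):

```python
from datasets import load_dataset

# Repo id assumed from the "details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Charlie911__MultiLoRA-mmlu",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="latest",          # the split that always mirrors the most recent run
)
print(data)
```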
[ "# Dataset Card for Evaluation run of Charlie911/MultiLoRA-mmlu\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/MultiLoRA-mmlu on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:16:37.919745(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Charlie911/MultiLoRA-mmlu\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/MultiLoRA-mmlu on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:16:37.919745(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6b57145083534ce41fe81336afadfb4092669d16
# Dataset Card for Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ArianAskari/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:23:23.624416](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-02T00-23-23.624416.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6381118013498635, "acc_stderr": 0.03228022346468125, "acc_norm": 0.6407857748626177, "acc_norm_stderr": 0.032921741406603505, "mc1": 0.3598531211750306, "mc1_stderr": 0.016801860466677157, "mc2": 0.5223229691481044, "mc2_stderr": 0.015242725441292206 }, "harness|arc:challenge|25": { "acc": 0.6075085324232082, "acc_stderr": 0.014269634635670733, "acc_norm": 0.6467576791808873, "acc_norm_stderr": 0.013967822714840055 }, "harness|hellaswag|10": { "acc": 0.6520613423620792, "acc_stderr": 0.00475342980664544, "acc_norm": 0.842760406293567, "acc_norm_stderr": 0.003632825479128597 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 
0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138215, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138215 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.047258156262526066, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526066 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.031584153240477114, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.031584153240477114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.02833560973246336, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.02833560973246336 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6076923076923076, "acc_stderr": 0.024756000382130956, "acc_norm": 0.6076923076923076, "acc_norm_stderr": 0.024756000382130956 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.02784081149587193, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.02784081149587193 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.0386155754625517, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.0386155754625517 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8311926605504587, "acc_stderr": 0.016060056268530343, "acc_norm": 0.8311926605504587, "acc_norm_stderr": 0.016060056268530343 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.034063153607115086, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.034063153607115086 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.02812597226565437, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.02812597226565437 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545847, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545847 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.039578354719809805, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.039578354719809805 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.02250903393707781, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.02250903393707781 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993459, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993459 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3094972067039106, "acc_stderr": 0.015461169002371542, "acc_norm": 0.3094972067039106, "acc_norm_stderr": 0.015461169002371542 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7549019607843137, "acc_stderr": 0.024630048979824782, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.024630048979824782 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464485, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464485 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 
0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462937, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462937 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806304, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806304 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142773, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786845, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786845 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3598531211750306, "mc1_stderr": 0.016801860466677157, "mc2": 0.5223229691481044, "mc2_stderr": 0.015242725441292206 }, "harness|winogrande|5": { "acc": 0.7797947908445146, "acc_stderr": 0.011646276755089688 }, "harness|gsm8k|5": { "acc": 0.5686125852918877, "acc_stderr": 0.013642195352511564 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B
[ "region:us" ]
2024-02-02T00:25:42+00:00
{"pretty_name": "Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:23:23.624416](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-02T00-23-23.624416.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6381118013498635,\n \"acc_stderr\": 0.03228022346468125,\n \"acc_norm\": 0.6407857748626177,\n \"acc_norm_stderr\": 0.032921741406603505,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5223229691481044,\n \"mc2_stderr\": 0.015242725441292206\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670733,\n \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840055\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n \"acc_stderr\": 0.00475342980664544,\n \"acc_norm\": 0.842760406293567,\n \"acc_norm_stderr\": 0.003632825479128597\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 
0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371542,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371542\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5223229691481044,\n \"mc2_stderr\": 0.015242725441292206\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5686125852918877,\n \"acc_stderr\": 0.013642195352511564\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["**/details_harness|winogrande|5_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T00-23-23.624416.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T00_23_23.624416", "path": ["results_2024-02-02T00-23-23.624416.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-23-23.624416.parquet"]}]}]}
2024-02-02T00:26:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B Dataset automatically created during the evaluation run of model ArianAskari/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:23:23.624416 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
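The sentence "To load the details from a run, you can for instance do the following:" in the flattened card above originally introduced a code snippet that was stripped during text extraction. A minimal reconstruction is sketched below; the repo id is an assumption inferred from the model name and the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (it is not stated verbatim in this record), while the `harness_winogrande_5` config and the `latest` split are taken from the config list in the record's metadata.

```python
from datasets import load_dataset

# Assumed repo id, inferred from the record's model name and the leaderboard's
# usual naming pattern; verify it on the Hub before relying on it.
repo_id = "open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B"

# "latest" is one of the splits declared for every per-task config in this record;
# a timestamped split (e.g. "2024_02_02T00_23_23.624416") holds the same run.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data[0])
```

Any other per-task config listed in the record (for example the `harness_hendrycksTest_*_5` entries) can be loaded the same way by swapping the config name.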
[ "# Dataset Card for Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:23:23.624416(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:23:23.624416(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3775ddcdb00ac71fe077de01fa34d662171d24be
**QUILT-LLaVA Visual Instruct 107K Dataset Card** **Paper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos** **Paper or resources for more information:** https://quilt-llava.github.io/ <p align="center"> <img src="https://quilt-llava.github.io/static/images/clusters2.png" alt="fig2" width="90%"/> </p> **Description and Details** 1. YouTube educational histopathology videos are a valuable source of grounded histopathology data for instructional purposes, particularly for visual instruction tuning. 2. Similar to LLaVA, the approach involves using independent prompts for generating Q&A pairs from image captions using GPT-4. In contrast to LLaVA-Med, this approach adds spatial grounding by extracting mouse pointers to link narrator's speech to specific regions of images, improving spatial awareness. 3. Traditional image-caption datasets often lack contextual connections, limiting Q/A pairs generated by GPT-4 to the context of a single image. For histopathology, which requires holistic analysis, the proposal suggests reasoning-based prompting techniques. These techniques include Complex Reasoning, where GPT-4 uses diagnosis and contributory facts to extrapolate beyond the immediate context, and Iterative Abductive Reasoning, which simulates a conversation between two GPT-4 agents for in-depth medical questioning and evaluation. 4. In Complex Reasoning, GPT-4 is prompted with a caption, diagnosis, and facts to perform diagnostic reasoning that goes beyond the single image context. 5. In Iterative Abductive Reasoning, a conversation is simulated between two GPT-4 agents: Human-GPT, provided with a single image caption for abductive reasoning, and AI Assistant GPT, which has access to diagnosis and facts to provide feedback, resembling a professional medical consultation. This iterative process continues until a conclusion is reached. <p align="center"> <img src="https://quilt-llava.github.io/static/images/iterative_1.png" alt="fig2" width="90%"/> </p> **Dataset date:** QUILT-LLaVA Visual Instruct 107K was collected in November 2023, by prompting GPT-4-turbo API. **License:** MIT License; and it should abide by the policy of OpenAI: https://openai.com/policies/terms-of-use **Where to send questions or comments about the model:** https://github.com/quilt-llava/quilt-llava.github.io/issues **Primary intended uses:** The primary use of QUILT-LLaVA is research on histopathology large multimodal models and chatbots. **Primary intended users:** The dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models **Citation** ```bibtex @misc{seyfioglu2023quiltllava, title={Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos}, author={Mehmet Saygin Seyfioglu and Wisdom O. Ikezogwo and Fatemeh Ghezloo and Ranjay Krishna and Linda Shapiro}, year={2023}, eprint={2312.04746}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ```bibtex @misc{ikezogwo2023quilt1m, title={Quilt-1M: One Million Image-Text Pairs for Histopathology}, author={Wisdom Oluchi Ikezogwo and Mehmet Saygin Seyfioglu and Fatemeh Ghezloo and Dylan Stefan Chan Geva and Fatwir Sheikh Mohammed and Pavan Kumar Anand and Ranjay Krishna and Linda Shapiro}, year={2023}, eprint={2306.11207}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
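The card above states the primary intended use (research on histopathology multimodal models and chatbots) but does not include a loading example. Below is a minimal, hypothetical sketch using the Hugging Face `datasets` library; the repo id comes from this record, but the repo is gated (see the `extra_gated_prompt` in the record's metadata), so the terms must be accepted on the Hub and you must authenticate first, and the split name is an assumption rather than something documented here.

```python
from datasets import load_dataset

# Repo id taken from this record. Access is gated: accept the terms on the Hub,
# then authenticate (e.g. `huggingface-cli login`) before loading.
quilt_instruct = load_dataset(
    "wisdomik/QUILT-LLaVA-Instruct-107K",
    split="train",  # assumed split name; check the repo for the actual splits
)

# Inspect one instruction-tuning example; the schema is not documented in this record.
print(quilt_instruct[0])
```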
wisdomik/QUILT-LLaVA-Instruct-107K
[ "task_categories:visual-question-answering", "task_categories:question-answering", "size_categories:100K<n<1M", "language:en", "license:mit", "arxiv:2312.04746", "arxiv:2306.11207", "region:us" ]
2024-02-02T00:26:08+00:00
{"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["visual-question-answering", "question-answering"], "pretty_name": "QUILT-LLaVA Visual Instruct 107K", "extra_gated_prompt": "Please read and agree to the following terms: 1. The requester details provided are not faked. 2. The resource will not be used for commercial/clinical purposes and will be used for scientific research only. 3. The data will not be re-distributed, published, copied, or further disseminated in any way or form whatsoever, whether for profit or not. 4. The right study/paper (Quilt-1M(https://quilt1m.github.io/) and Quilt-LLaVa (https://quilt-llava.github.io) papers) will be cited in any publication(s) that uses this model/data ", "extra_gated_fields": {"Email": "text", "First and last name": "text", "Affiliation": "text", "Type of Affiliation": {"type": "select", "options": ["Academia", "Industry", "Other"]}, "I want to use this model for": {"type": "select", "options": ["Research", "Education", {"label": "Other", "value": "other"}]}, "I agree to the aforementioned terms of use": "checkbox"}}
2024-02-14T21:49:16+00:00
[ "2312.04746", "2306.11207" ]
[ "en" ]
TAGS #task_categories-visual-question-answering #task_categories-question-answering #size_categories-100K<n<1M #language-English #license-mit #arxiv-2312.04746 #arxiv-2306.11207 #region-us
QUILT-LLaVA Visual Instruct 107K Dataset Card Paper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos Paper or resources for more information: URL <p align="center"> <img src="URL alt="fig2" width="90%"/> </p> Description and Details 1. YouTube educational histopathology videos are a valuable source of grounded histopathology data for instructional purposes, particularly for visual instruction tuning. 2. Similar to LLaVA, the approach involves using independent prompts for generating Q&A pairs from image captions using GPT-4. In contrast to LLaVA-Med, this approach adds spatial grounding by extracting mouse pointers to link narrator's speech to specific regions of images, improving spatial awareness. 3. Traditional image-caption datasets often lack contextual connections, limiting Q/A pairs generated by GPT-4 to the context of a single image. For histopathology, which requires holistic analysis, the proposal suggests reasoning-based prompting techniques. These techniques include Complex Reasoning, where GPT-4 uses diagnosis and contributory facts to extrapolate beyond the immediate context, and Iterative Abductive Reasoning, which simulates a conversation between two GPT-4 agents for in-depth medical questioning and evaluation. 4. In Complex Reasoning, GPT-4 is prompted with a caption, diagnosis, and facts to perform diagnostic reasoning that goes beyond the single image context. 5. In Iterative Abductive Reasoning, a conversation is simulated between two GPT-4 agents: Human-GPT, provided with a single image caption for abductive reasoning, and AI Assistant GPT, which has access to diagnosis and facts to provide feedback, resembling a professional medical consultation. This iterative process continues until a conclusion is reached. <p align="center"> <img src="URL alt="fig2" width="90%"/> </p> Dataset date: QUILT-LLaVA Visual Instruct 107K was collected in November 2023, by prompting GPT-4-turbo API. License: MIT License; and it should abide by the policy of OpenAI: URL Where to send questions or comments about the model: URL Primary intended uses: The primary use of QUILT-LLaVA is research on histopathology large multimodal models and chatbots. Primary intended users: The dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models Citation
[]
[ "TAGS\n#task_categories-visual-question-answering #task_categories-question-answering #size_categories-100K<n<1M #language-English #license-mit #arxiv-2312.04746 #arxiv-2306.11207 #region-us \n" ]
8f3e05a74874797f9e7c08c3b898b4cda3a246eb
# Dataset Card for Evaluation run of mlabonne/OmniBeagle-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [mlabonne/OmniBeagle-7B](https://huggingface.co/mlabonne/OmniBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mlabonne__OmniBeagle-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:27:50.454931](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__OmniBeagle-7B/blob/main/results_2024-02-02T00-27-50.454931.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6537766004723544, "acc_stderr": 0.032072993471282146, "acc_norm": 0.6533686908979144, "acc_norm_stderr": 0.032738749422162323, "mc1": 0.591187270501836, "mc1_stderr": 0.017209952151641724, "mc2": 0.7444830471955781, "mc2_stderr": 0.014267191160465879 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068747, "acc_norm": 0.7261092150170648, "acc_norm_stderr": 0.013032004972989506 }, "harness|hellaswag|10": { "acc": 0.7140011949810795, "acc_stderr": 0.004509652679395677, "acc_norm": 0.8892650866361282, "acc_norm_stderr": 0.003131622628199086 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7986111111111112, "acc_stderr": 0.03353647469713839, "acc_norm": 0.7986111111111112, "acc_norm_stderr": 0.03353647469713839 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055273, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055273 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726855, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726855 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328974, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328974 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297793, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.39072847682119205, "acc_stderr": 
0.039837983066598075, "acc_norm": 0.39072847682119205, "acc_norm_stderr": 0.039837983066598075 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993464, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993464 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258176, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818737, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818737 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657478, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657478 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146293, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146293 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685517, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.591187270501836, "mc1_stderr": 0.017209952151641724, "mc2": 0.7444830471955781, "mc2_stderr": 0.014267191160465879 }, "harness|winogrande|5": { "acc": 0.8310970797158642, "acc_stderr": 0.010529981411838906 }, "harness|gsm8k|5": { "acc": 0.7005307050796058, "acc_stderr": 0.012616300735519649 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
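As a supplement to the card above: the aggregated metrics reproduced in its results JSON are also stored in the "results" configuration of this details repository (open-llm-leaderboard/details_mlabonne__OmniBeagle-7B). The following is a minimal, untested sketch of reading them, assuming only the datasets library and the "latest" split described in the card:

```python
from datasets import load_dataset

# Sketch only (assumed usage): the "results" configuration stores the
# aggregated metrics shown in the card's results JSON, and its "latest"
# split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_mlabonne__OmniBeagle-7B",
    "results",
    split="latest",
)
print(results[0])
```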
open-llm-leaderboard/details_mlabonne__OmniBeagle-7B
[ "region:us" ]
2024-02-02T00:30:07+00:00
{"pretty_name": "Evaluation run of mlabonne/OmniBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/OmniBeagle-7B](https://huggingface.co/mlabonne/OmniBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__OmniBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:27:50.454931](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__OmniBeagle-7B/blob/main/results_2024-02-02T00-27-50.454931.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537766004723544,\n \"acc_stderr\": 0.032072993471282146,\n \"acc_norm\": 0.6533686908979144,\n \"acc_norm_stderr\": 0.032738749422162323,\n \"mc1\": 0.591187270501836,\n \"mc1_stderr\": 0.017209952151641724,\n \"mc2\": 0.7444830471955781,\n \"mc2_stderr\": 0.014267191160465879\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068747,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7140011949810795,\n \"acc_stderr\": 0.004509652679395677,\n \"acc_norm\": 0.8892650866361282,\n \"acc_norm_stderr\": 0.003131622628199086\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 
0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657478,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657478\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.591187270501836,\n \"mc1_stderr\": 0.017209952151641724,\n \"mc2\": 0.7444830471955781,\n \"mc2_stderr\": 0.014267191160465879\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \"acc_stderr\": 0.012616300735519649\n }\n}\n```", "repo_url": 
"https://huggingface.co/mlabonne/OmniBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-27-50.454931.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-27-50.454931.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-27-50.454931.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-27-50.454931.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-27-50.454931.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_27_50.454931", "path": ["**/details_harness|winogrande|5_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-27-50.454931.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_27_50.454931", "path": ["results_2024-02-02T00-27-50.454931.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-27-50.454931.parquet"]}]}]}
2024-02-02T00:30:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mlabonne/OmniBeagle-7B Dataset automatically created during the evaluation run of model mlabonne/OmniBeagle-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:27:50.454931 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of mlabonne/OmniBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/OmniBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:27:50.454931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mlabonne/OmniBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/OmniBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:27:50.454931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b7359e812832797a15a6a529a9722dcf047983ff
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-v2](https://huggingface.co/jsfs11/MixtureofMerges-MoE-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:33:54.387134](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2/blob/main/results_2024-02-02T00-33-54.387134.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6543220046149792, "acc_stderr": 0.032031600560374206, "acc_norm": 0.6541067503001904, "acc_norm_stderr": 0.03269555640627793, "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7091803983210125, "mc2_stderr": 0.01482201181219182 }, "harness|arc:challenge|25": { "acc": 0.697098976109215, "acc_stderr": 0.013428241573185349, "acc_norm": 0.7244027303754266, "acc_norm_stderr": 0.01305716965576184 }, "harness|hellaswag|10": { "acc": 0.710017924716192, "acc_stderr": 0.004528264116475881, "acc_norm": 0.8840868352917746, "acc_norm_stderr": 0.0031946652660786025 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138208, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138208 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652457, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652457 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977945, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977945 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, 
"acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374307, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374307 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250444, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250444 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.822477650063857, "acc_stderr": 0.01366423099583483, "acc_norm": 0.822477650063857, "acc_norm_stderr": 0.01366423099583483 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258176, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42793296089385474, "acc_stderr": 0.01654788799741611, "acc_norm": 0.42793296089385474, "acc_norm_stderr": 0.01654788799741611 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.02521804037341063, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.02521804037341063 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657474, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657474 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406755, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406755 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7091803983210125, "mc2_stderr": 0.01482201181219182 }, "harness|winogrande|5": { "acc": 0.8358326756116812, "acc_stderr": 0.010410849775222789 }, "harness|gsm8k|5": { "acc": 0.6868840030326004, "acc_stderr": 0.012774285669385087 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
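As a supplement to the loading snippet above, here is a minimal sketch for pulling the aggregated metrics of the latest run. It assumes only the standard `datasets` API together with the configuration and split names declared in this card's metadata ("results" and "latest"); treat it as an illustration rather than an official loader.

```python
from datasets import load_dataset

# The "results" configuration aggregates the per-task metrics of the run;
# its "latest" split points to the most recent evaluation (per the card metadata).
results = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2",
    "results",
    split="latest",
)

# Inspect the first row of aggregated metrics.
print(results[0])
```

Per-task details (for example the `harness_gsm8k_5` configuration listed above) can be loaded the same way by swapping the configuration name.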
open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2
[ "region:us" ]
2024-02-02T00:36:13+00:00
{"pretty_name": "Evaluation run of jsfs11/MixtureofMerges-MoE-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-v2](https://huggingface.co/jsfs11/MixtureofMerges-MoE-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:33:54.387134](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2/blob/main/results_2024-02-02T00-33-54.387134.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543220046149792,\n \"acc_stderr\": 0.032031600560374206,\n \"acc_norm\": 0.6541067503001904,\n \"acc_norm_stderr\": 0.03269555640627793,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7091803983210125,\n \"mc2_stderr\": 0.01482201181219182\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.710017924716192,\n \"acc_stderr\": 0.004528264116475881,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.0031946652660786025\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250444,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250444\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 
0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657474,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657474\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7091803983210125,\n \"mc2_stderr\": 0.01482201181219182\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \"acc_stderr\": 0.012774285669385087\n }\n}\n```", "repo_url": 
"https://huggingface.co/jsfs11/MixtureofMerges-MoE-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-33-54.387134.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-33-54.387134.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-33-54.387134.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-33-54.387134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-33-54.387134.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_33_54.387134", "path": ["**/details_harness|winogrande|5_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-33-54.387134.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T00_33_54.387134", "path": ["results_2024-02-02T00-33-54.387134.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-33-54.387134.parquet"]}]}]}
2024-02-02T00:36:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-v2 Dataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-v2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:33:54.387134 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
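The loading snippet referenced in the card text above is not reproduced in this flattened representation. A minimal sketch, assuming the repository follows the same `details_<org>__<model>` naming convention used by the other leaderboard detail datasets in this document, would be:

```python
from datasets import load_dataset

# Hypothetical repository name, inferred from the details_<org>__<model> naming
# convention; the per-task config and "train" split mirror the other cards here.
data = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-v2",
    "harness_winogrande_5",
    split="train",
)
```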
[ "# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-v2\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:33:54.387134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-v2\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:33:54.387134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fa1e5d2848511c3fbc7381a3f09f46ceabb220ad
# Dataset Card for Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [traversaal-ai/traversaal-2.5-Mistral-7B](https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:34:42.679909](https://huggingface.co/datasets/open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B/blob/main/results_2024-02-02T00-34-42.679909.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6274978626802366, "acc_stderr": 0.03223484025810943, "acc_norm": 0.6366437826897652, "acc_norm_stderr": 0.032931031908270264, "mc1": 0.3708690330477356, "mc1_stderr": 0.01690969358024882, "mc2": 0.5399700915456362, "mc2_stderr": 0.015353094182217303 }, "harness|arc:challenge|25": { "acc": 0.6237201365187713, "acc_stderr": 0.014157022555407161, "acc_norm": 0.6621160409556314, "acc_norm_stderr": 0.013822047922283509 }, "harness|hellaswag|10": { "acc": 0.6597291376219877, "acc_stderr": 0.00472831857783521, "acc_norm": 0.850229038040231, "acc_norm_stderr": 0.0035611748104545588 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 
0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.0470070803355104, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.0470070803355104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782655, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782655 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.541871921182266, "acc_stderr": 0.03505630140785741, "acc_norm": 0.541871921182266, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768776, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768776 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6128205128205129, "acc_stderr": 0.024697216930878937, "acc_norm": 0.6128205128205129, "acc_norm_stderr": 0.024697216930878937 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.02813325257881564, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.02813325257881564 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8220183486238533, "acc_stderr": 0.016399436366612896, "acc_norm": 0.8220183486238533, "acc_norm_stderr": 0.016399436366612896 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.034086558679777494, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.034086558679777494 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967407, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967407 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545847, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545847 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.03957835471980981, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.03957835471980981 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119005, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119005 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8199233716475096, "acc_stderr": 0.013740797258579825, "acc_norm": 0.8199233716475096, "acc_norm_stderr": 0.013740797258579825 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.32513966480446926, "acc_stderr": 0.01566654278505355, "acc_norm": 0.32513966480446926, "acc_norm_stderr": 0.01566654278505355 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.024848018263875192, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.024848018263875192 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, 
"acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5177304964539007, "acc_stderr": 0.02980873964223777, "acc_norm": 0.5177304964539007, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.01274197433389723, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.01274197433389723 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898452, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898452 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.019117213911495155, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.019117213911495155 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786862, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786862 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3708690330477356, "mc1_stderr": 0.01690969358024882, "mc2": 0.5399700915456362, "mc2_stderr": 0.015353094182217303 }, "harness|winogrande|5": { "acc": 0.7790055248618785, "acc_stderr": 0.011661223637643412 }, "harness|gsm8k|5": { "acc": 0.1652767247915087, "acc_stderr": 0.010231031118582121 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
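As a complement to the per-task example embedded in the card above, the aggregated metrics can be pulled on their own; a minimal sketch, assuming the "results" configuration and "latest" split declared in this dataset's configuration metadata further below, would be:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; the "latest" split always
# points at the most recent timestamped evaluation for this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B",
    "results",
    split="latest",
)
print(results[0])
```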
open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B
[ "region:us" ]
2024-02-02T00:36:59+00:00
{"pretty_name": "Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [traversaal-ai/traversaal-2.5-Mistral-7B](https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:34:42.679909](https://huggingface.co/datasets/open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B/blob/main/results_2024-02-02T00-34-42.679909.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274978626802366,\n \"acc_stderr\": 0.03223484025810943,\n \"acc_norm\": 0.6366437826897652,\n \"acc_norm_stderr\": 0.032931031908270264,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5399700915456362,\n \"mc2_stderr\": 0.015353094182217303\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407161,\n \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283509\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6597291376219877,\n \"acc_stderr\": 0.00472831857783521,\n \"acc_norm\": 0.850229038040231,\n \"acc_norm_stderr\": 0.0035611748104545588\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n 
\"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32513966480446926,\n \"acc_stderr\": 0.01566654278505355,\n \"acc_norm\": 0.32513966480446926,\n \"acc_norm_stderr\": 0.01566654278505355\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898452,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786862,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786862\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5399700915456362,\n \"mc2_stderr\": 0.015353094182217303\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643412\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.1652767247915087,\n \"acc_stderr\": 0.010231031118582121\n }\n}\n```", "repo_url": "https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["**/details_harness|winogrande|5_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T00-34-42.679909.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T00_34_42.679909", "path": ["results_2024-02-02T00-34-42.679909.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-34-42.679909.parquet"]}]}]}
2024-02-02T00:37:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B Dataset automatically created during the evaluation run of model traversaal-ai/traversaal-2.5-Mistral-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:34:42.679909 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
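The card above refers to a `load_dataset` call for pulling the per-run details, but the snippet itself is not included in this flattened text. A minimal sketch follows; the repository id is an assumption inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (it is not stated verbatim in this record), while the config name and `latest` split are taken from the configs metadata listed earlier in this record.

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming pattern (assumption),
# config and split names taken from the "configs" metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```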
[ "# Dataset Card for Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model traversaal-ai/traversaal-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:34:42.679909(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model traversaal-ai/traversaal-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:34:42.679909(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fc914fec321711208eb6d65376fa14b8864e1ad8
# Dataset Card for Evaluation run of M4-ai/TinyMistral-6x248M-Instruct <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [M4-ai/TinyMistral-6x248M-Instruct](https://huggingface.co/M4-ai/TinyMistral-6x248M-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:43:37.562563](https://huggingface.co/datasets/open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct/blob/main/results_2024-02-02T00-43-37.562563.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24095884646371188, "acc_stderr": 0.03008991362961084, "acc_norm": 0.24148314901574003, "acc_norm_stderr": 0.03089156878834565, "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662595, "mc2": 0.4316446283639332, "mc2_stderr": 0.015551246607562378 }, "harness|arc:challenge|25": { "acc": 0.17235494880546076, "acc_stderr": 0.011037113093461295, "acc_norm": 0.22440273037542663, "acc_norm_stderr": 0.012191404938603836 }, "harness|hellaswag|10": { "acc": 0.2672774347739494, "acc_stderr": 0.004416339450436124, "acc_norm": 0.27016530571599284, "acc_norm_stderr": 0.004431375549911368 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.31851851851851853, "acc_stderr": 0.040247784019771096, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.19, "acc_stderr": 0.03942772444036625, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.025288394502891366, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.025288394502891366 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2152777777777778, "acc_stderr": 0.03437079344106133, "acc_norm": 0.2152777777777778, "acc_norm_stderr": 0.03437079344106133 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.03775251680686371, "acc_norm": 0.17, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.15, "acc_stderr": 0.035887028128263714, "acc_norm": 
0.15, "acc_norm_stderr": 0.035887028128263714 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.23121387283236994, "acc_stderr": 0.0321473730202947, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.0321473730202947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.18, "acc_stderr": 0.03861229196653696, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653696 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.20851063829787234, "acc_stderr": 0.026556982117838735, "acc_norm": 0.20851063829787234, "acc_norm_stderr": 0.026556982117838735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643898, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643898 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03718489006818115, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03718489006818115 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2806451612903226, "acc_stderr": 0.02556060472102289, "acc_norm": 0.2806451612903226, "acc_norm_stderr": 0.02556060472102289 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2857142857142857, "acc_stderr": 0.031785297106427496, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.031785297106427496 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.22424242424242424, "acc_stderr": 0.032568666616811015, "acc_norm": 0.22424242424242424, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.2727272727272727, "acc_stderr": 0.03173071239071724, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.31088082901554404, "acc_stderr": 0.03340361906276587, "acc_norm": 0.31088082901554404, "acc_norm_stderr": 0.03340361906276587 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3435897435897436, "acc_stderr": 0.024078696580635467, "acc_norm": 0.3435897435897436, "acc_norm_stderr": 0.024078696580635467 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.24369747899159663, "acc_stderr": 0.02788682807838057, "acc_norm": 0.24369747899159663, "acc_norm_stderr": 0.02788682807838057 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.23178807947019867, "acc_stderr": 0.034454062719870546, "acc_norm": 0.23178807947019867, "acc_norm_stderr": 0.034454062719870546 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22935779816513763, "acc_stderr": 0.018025349724618684, "acc_norm": 0.22935779816513763, "acc_norm_stderr": 0.018025349724618684 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02977177522814563, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02977177522814563 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.24472573839662448, "acc_stderr": 0.02798569938703642, "acc_norm": 0.24472573839662448, "acc_norm_stderr": 0.02798569938703642 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.19282511210762332, "acc_stderr": 0.02647824096048936, "acc_norm": 0.19282511210762332, "acc_norm_stderr": 0.02647824096048936 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.1984732824427481, "acc_stderr": 0.0349814938546247, "acc_norm": 0.1984732824427481, "acc_norm_stderr": 0.0349814938546247 }, "harness|hendrycksTest-international_law|5": { "acc": 0.34710743801652894, "acc_stderr": 0.04345724570292534, "acc_norm": 0.34710743801652894, "acc_norm_stderr": 0.04345724570292534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.26851851851851855, "acc_stderr": 0.04284467968052191, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.04284467968052191 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.24107142857142858, "acc_stderr": 0.04059867246952685, "acc_norm": 0.24107142857142858, "acc_norm_stderr": 0.04059867246952685 }, "harness|hendrycksTest-management|5": { "acc": 0.18446601941747573, "acc_stderr": 0.03840423627288276, "acc_norm": 0.18446601941747573, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.1752136752136752, "acc_stderr": 0.024904439098918225, "acc_norm": 0.1752136752136752, "acc_norm_stderr": 0.024904439098918225 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.27586206896551724, "acc_stderr": 0.01598281477469563, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.01598281477469563 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.21965317919075145, "acc_stderr": 0.022289638852617897, "acc_norm": 0.21965317919075145, "acc_norm_stderr": 0.022289638852617897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24509803921568626, "acc_stderr": 0.024630048979824775, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.024630048979824775 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3022508038585209, "acc_stderr": 0.02608270069539966, "acc_norm": 0.3022508038585209, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.25308641975308643, "acc_stderr": 
0.024191808600713002, "acc_norm": 0.25308641975308643, "acc_norm_stderr": 0.024191808600713002 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2553191489361702, "acc_stderr": 0.026011992930902013, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.026011992930902013 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.01099615663514269, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.01099615663514269 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.02352924218519311, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.02352924218519311 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25163398692810457, "acc_stderr": 0.017555818091322256, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.017555818091322256 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2545454545454545, "acc_stderr": 0.041723430387053825, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2612244897959184, "acc_stderr": 0.028123429335142777, "acc_norm": 0.2612244897959184, "acc_norm_stderr": 0.028123429335142777 }, "harness|hendrycksTest-sociology|5": { "acc": 0.208955223880597, "acc_stderr": 0.028748298931728655, "acc_norm": 0.208955223880597, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-virology|5": { "acc": 0.18072289156626506, "acc_stderr": 0.02995573785581014, "acc_norm": 0.18072289156626506, "acc_norm_stderr": 0.02995573785581014 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03188578017686399, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03188578017686399 }, "harness|truthfulqa:mc|0": { "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662595, "mc2": 0.4316446283639332, "mc2_stderr": 0.015551246607562378 }, "harness|winogrande|5": { "acc": 0.5059194948697711, "acc_stderr": 0.014051500838485807 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
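For the aggregated numbers quoted in the "Latest results" section of this card, the card states they are also stored under the "results" configuration, with "latest" always pointing at the newest run. A small sketch of loading that config is below, reusing the dataset id from the card's own loading example; the exact column layout of the results parquet is not shown in this record, so the final print is only for inspection.

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run and "latest" points to
# the newest evaluation, per the card text above.
results = load_dataset(
    "open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct",
    "results",
    split="latest",
)
print(results[0])  # inspect how the aggregated metrics are stored
```

The same call pattern should work for any per-task config named in the metadata below, for example "harness_hendrycksTest_anatomy_5".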
open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct
[ "region:us" ]
2024-02-02T00:45:54+00:00
{"pretty_name": "Evaluation run of M4-ai/TinyMistral-6x248M-Instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [M4-ai/TinyMistral-6x248M-Instruct](https://huggingface.co/M4-ai/TinyMistral-6x248M-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:43:37.562563](https://huggingface.co/datasets/open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct/blob/main/results_2024-02-02T00-43-37.562563.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24095884646371188,\n \"acc_stderr\": 0.03008991362961084,\n \"acc_norm\": 0.24148314901574003,\n \"acc_norm_stderr\": 0.03089156878834565,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662595,\n \"mc2\": 0.4316446283639332,\n \"mc2_stderr\": 0.015551246607562378\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.17235494880546076,\n \"acc_stderr\": 0.011037113093461295,\n \"acc_norm\": 0.22440273037542663,\n \"acc_norm_stderr\": 0.012191404938603836\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2672774347739494,\n \"acc_stderr\": 0.004416339450436124,\n \"acc_norm\": 0.27016530571599284,\n \"acc_norm_stderr\": 0.004431375549911368\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.03437079344106133,\n \"acc_norm\": 0.2152777777777778,\n \"acc_norm_stderr\": 0.03437079344106133\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838735,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.2806451612903226,\n \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.03340361906276587,\n \"acc_norm\": 0.31088082901554404,\n 
\"acc_norm_stderr\": 0.03340361906276587\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635467,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635467\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838057,\n \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838057\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.24472573839662448,\n \"acc_stderr\": 0.02798569938703642,\n \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.02798569938703642\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292534,\n \"acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952685,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952685\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1752136752136752,\n \"acc_stderr\": 0.024904439098918225,\n \"acc_norm\": 0.1752136752136752,\n \"acc_norm_stderr\": 0.024904439098918225\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.022289638852617897,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.022289638852617897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2612244897959184,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.2612244897959184,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.208955223880597,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18072289156626506,\n \"acc_stderr\": 0.02995573785581014,\n \"acc_norm\": 0.18072289156626506,\n \"acc_norm_stderr\": 0.02995573785581014\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662595,\n \"mc2\": 0.4316446283639332,\n \"mc2_stderr\": 0.015551246607562378\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5059194948697711,\n \"acc_stderr\": 0.014051500838485807\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/M4-ai/TinyMistral-6x248M-Instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-43-37.562563.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-43-37.562563.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-43-37.562563.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-43-37.562563.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-43-37.562563.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["**/details_harness|winogrande|5_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T00-43-37.562563.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T00_43_37.562563", "path": ["results_2024-02-02T00-43-37.562563.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T00-43-37.562563.parquet"]}]}]}
2024-02-02T00:46:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of M4-ai/TinyMistral-6x248M-Instruct Dataset automatically created during the evaluation run of model M4-ai/TinyMistral-6x248M-Instruct on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a load example is sketched after this card text): ## Latest results These are the latest results from run 2024-02-02T00:43:37.562563 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
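The card text above refers to a load example that was stripped from this flattened rendering. Below is a minimal sketch of that call; the repository id is an assumption inferred from the leaderboard's usual details_<org>__<model> naming pattern (it is not stated in this record), while the harness_winogrande_5 config and the "latest" split are taken from the run metadata earlier in this record.

```python
from datasets import load_dataset

# Repo id is an assumption, inferred from the leaderboard's
# "open-llm-leaderboard/details_<org>__<model>" naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_M4-ai__TinyMistral-6x248M-Instruct",
    "harness_winogrande_5",  # one of the per-task configs listed in this record's metadata
    split="latest",          # each config defines a timestamped split and a "latest" split
)
```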
[ "# Dataset Card for Evaluation run of M4-ai/TinyMistral-6x248M-Instruct\n\n\n\nDataset automatically created during the evaluation run of model M4-ai/TinyMistral-6x248M-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:43:37.562563(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of M4-ai/TinyMistral-6x248M-Instruct\n\n\n\nDataset automatically created during the evaluation run of model M4-ai/TinyMistral-6x248M-Instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:43:37.562563(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4dcfcf452540c62a46d07764bbb0cd0bfa74b1c0
# Dataset Card for Evaluation run of AIJUUD/juud-Mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AIJUUD/juud-Mistral-7B](https://huggingface.co/AIJUUD/juud-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T00:46:47.329333](https://huggingface.co/datasets/open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B/blob/main/results_2024-02-02T00-46-47.329333.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6298199576276905, "acc_stderr": 0.032307298049248236, "acc_norm": 0.6379923251397444, "acc_norm_stderr": 0.032981988953013575, "mc1": 0.3684210526315789, "mc1_stderr": 0.016886551261046046, "mc2": 0.5412316525132941, "mc2_stderr": 0.015338639083594787 }, "harness|arc:challenge|25": { "acc": 0.6245733788395904, "acc_stderr": 0.014150631435111726, "acc_norm": 0.6672354948805461, "acc_norm_stderr": 0.013769863046192307 }, "harness|hellaswag|10": { "acc": 0.6591316470822546, "acc_stderr": 0.004730324556624128, "acc_norm": 0.8500298745269866, "acc_norm_stderr": 0.003563124427458522 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.042763494943765995, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.042763494943765995 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.02854479331905533, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.02854479331905533 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { 
"acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086923996, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086923996 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642507, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642507 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.031584153240477114, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.031584153240477114 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229862, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229862 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768776, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768776 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6102564102564103, "acc_stderr": 0.024726967886647074, "acc_norm": 0.6102564102564103, "acc_norm_stderr": 0.024726967886647074 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228405, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228405 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 
0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8238532110091743, "acc_stderr": 0.016332882393431367, "acc_norm": 0.8238532110091743, "acc_norm_stderr": 0.016332882393431367 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.02862654791243741, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.02862654791243741 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545843, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545843 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119005, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119005 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464076, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464076 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3206703910614525, "acc_stderr": 0.015609929559348397, "acc_norm": 0.3206703910614525, "acc_norm_stderr": 0.015609929559348397 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.024848018263875192, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.024848018263875192 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6816720257234726, "acc_stderr": 0.026457225067811025, "acc_norm": 0.6816720257234726, "acc_norm_stderr": 0.026457225067811025 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 
0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46284224250325945, "acc_stderr": 0.012734923579532072, "acc_norm": 0.46284224250325945, "acc_norm_stderr": 0.012734923579532072 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.019094228167000318, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.019094228167000318 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8009950248756219, "acc_stderr": 0.028231365092758406, "acc_norm": 0.8009950248756219, "acc_norm_stderr": 0.028231365092758406 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3684210526315789, "mc1_stderr": 0.016886551261046046, "mc2": 0.5412316525132941, "mc2_stderr": 0.015338639083594787 }, "harness|winogrande|5": { "acc": 0.7797947908445146, "acc_stderr": 0.01164627675508969 }, "harness|gsm8k|5": { "acc": 0.2312357846853677, "acc_stderr": 0.011613587503166618 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B
[ "region:us" ]
2024-02-02T00:49:03+00:00
{"pretty_name": "Evaluation run of AIJUUD/juud-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIJUUD/juud-Mistral-7B](https://huggingface.co/AIJUUD/juud-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T00:46:47.329333](https://huggingface.co/datasets/open-llm-leaderboard/details_AIJUUD__juud-Mistral-7B/blob/main/results_2024-02-02T00-46-47.329333.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6298199576276905,\n \"acc_stderr\": 0.032307298049248236,\n \"acc_norm\": 0.6379923251397444,\n \"acc_norm_stderr\": 0.032981988953013575,\n \"mc1\": 0.3684210526315789,\n \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5412316525132941,\n \"mc2_stderr\": 0.015338639083594787\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192307\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6591316470822546,\n \"acc_stderr\": 0.004730324556624128,\n \"acc_norm\": 0.8500298745269866,\n \"acc_norm_stderr\": 0.003563124427458522\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431367,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431367\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 
0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.015609929559348397,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.015609929559348397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532072,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532072\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5412316525132941,\n \"mc2_stderr\": 0.015338639083594787\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.01164627675508969\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2312357846853677,\n \"acc_stderr\": 0.011613587503166618\n }\n}\n```", "repo_url": "https://huggingface.co/AIJUUD/juud-Mistral-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-46-47.329333.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["**/details_harness|winogrande|5_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T00-46-47.329333.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T00_46_47.329333", "path": ["results_2024-02-02T00-46-47.329333.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T00-46-47.329333.parquet"]}]}]}
2024-02-02T00:49:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AIJUUD/juud-Mistral-7B Dataset automatically created during the evaluation run of model AIJUUD/juud-Mistral-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T00:46:47.329333 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of AIJUUD/juud-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model AIJUUD/juud-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:46:47.329333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AIJUUD/juud-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model AIJUUD/juud-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T00:46:47.329333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
772fb29780b3184b10a11cc2f0238af2cf7326aa
## LOGIC-701 Benchmark This is a **synthetic** and **filtered** dataset for benchmarking large language models (LLMs). It consists of 701 medium and hard **logic puzzles** with solutions on 10 distinct topics. A feature of the dataset is that it tests exclusively logical/reasoning abilities, offering only 5 answer options. \ There are no or very few tasks in the dataset that require external knowledge about events, people, facts, etc. ### Languages This benchmark is also part of an initiative to evaluate the ability of models to think in different languages. \ So it was generated in English and translated into Russian, preserving the meaning as much as possible. ### Creation The dataset was created in 5 stages: 1. Generating a large number of tasks with answer choices on given topics in English using **gpt-4-1106-preview** 2. Deduplication and condition filtering using **intfloat/e5-large-v2** 3. 4 independent runs of **gpt-4-1106-preview** were conducted to determine the best answer for each task 4. All tasks on which **gpt-4-1106-preview** gave diverging answers in 2 or more of the 4 independent generations were removed 5. Translation of the problems into Russian, also using **gpt-4-1106-preview**, to create a parallel corpus ### Topics of logic puzzles | Topic | Count | |--------------------------------------|-------| | Probability and statistics | 120 | | Spatial reasoning | 118 | | Optimization of actions and planning | 104 | | Operation of mechanisms | 80 | | Sequence solving | 77 | | Math problems | 51 | | Functions and Algorithms | 50 | | Lateral thinking | 44 | | Classic riddles | 41 | | Logical traps | 16 | ### Correct answer choice distribution | correct_option_number | Count | |-----------------------|-------| | 2 | 201 | | 3 | 164 | | 1 | 143 | | 5 | 140 | | 4 | 53 | ### Authors Sergei Bratchikov (Tochka Bank) - @hivaze
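A minimal loading sketch, assuming the `en`/`ru` configurations, the `train` split and the column names declared in this record's metadata:

```python
from datasets import load_dataset

# English configuration of the benchmark; a parallel Russian config ("ru") is also declared
logic701 = load_dataset("hivaze/LOGIC-701", "en", split="train")

sample = logic701[0]
print(sample["topic"])                  # e.g. "Probability and statistics"
print(sample["problem_statement"])      # the puzzle text
print([sample[f"answer_option_{i}"] for i in range(1, 6)])  # the five answer options
print(sample["correct_option_number"])  # 1-based index of the correct option
```

Scoring a model then reduces to comparing its chosen option number against `correct_option_number` for each row.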
hivaze/LOGIC-701
[ "size_categories:n<1K", "language:en", "language:ru", "license:apache-2.0", "benchmark", "logic", "instruct", "reasoning", "region:us" ]
2024-02-02T01:07:36+00:00
{"language": ["en", "ru"], "license": "apache-2.0", "size_categories": ["n<1K"], "dataset_info": [{"config_name": "en", "features": [{"name": "topic", "dtype": "string"}, {"name": "problem_statement", "dtype": "string"}, {"name": "solution", "dtype": "string"}, {"name": "answer_option_1", "dtype": "string"}, {"name": "answer_option_2", "dtype": "string"}, {"name": "answer_option_3", "dtype": "string"}, {"name": "answer_option_4", "dtype": "string"}, {"name": "answer_option_5", "dtype": "string"}, {"name": "correct_option_number", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 687953, "num_examples": 701}], "download_size": 372910, "dataset_size": 687953}, {"config_name": "ru", "features": [{"name": "topic", "dtype": "string"}, {"name": "problem_statement", "dtype": "string"}, {"name": "solution", "dtype": "string"}, {"name": "answer_option_1", "dtype": "string"}, {"name": "answer_option_2", "dtype": "string"}, {"name": "answer_option_3", "dtype": "string"}, {"name": "answer_option_4", "dtype": "string"}, {"name": "answer_option_5", "dtype": "string"}, {"name": "correct_option_number", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1211167, "num_examples": 701}], "download_size": 559700, "dataset_size": 1211167}], "configs": [{"config_name": "en", "data_files": [{"split": "train", "path": "en/train-*"}]}, {"config_name": "ru", "data_files": [{"split": "train", "path": "ru/train-*"}]}], "tags": ["benchmark", "logic", "instruct", "reasoning"]}
2024-02-03T06:18:18+00:00
[]
[ "en", "ru" ]
TAGS #size_categories-n<1K #language-English #language-Russian #license-apache-2.0 #benchmark #logic #instruct #reasoning #region-us
LOGIC-701 Benchmark ------------------- This is a synthetic and filtered dataset for benchmarking large language models (LLMs). It consists of 701 medium and hard logic puzzles with solutions on 10 distinct topics. A feature of the dataset is that it tests exclusively logical/reasoning abilities, offering only 5 answer options. There are no or very few tasks in the dataset that require external knowledge about events, people, facts, etc. ### Languages This benchmark is also part of an initiative to evaluate the ability of models to think in different languages. So it was generated in English and translated into Russian, preserving the meaning as much as possible. ### Creation The dataset was created in 5 stages: 1. Generating a large number of tasks with answer choices on given topics in English using gpt-4-1106-preview 2. Deduplication and condition filtering using intfloat/e5-large-v2 3. 4 independent runs of gpt-4-1106-preview were conducted to determine the best answer for each task 4. All tasks for which gpt-4-1106-preview diverged in answers from 2 times in 4 independent generations were removed 5. Translation of the problems into Russian, also using gpt-4-1106-preview, to create a parallel corpus ### Topics of logic puzzles ### Correct answer choice distribution ### Authors Sergei Bratchikov (Tochka Bank) - @hivaze
[ "### Languages\n\n\nThis benchmark is also part of an initiative to evaluate the ability of models to think in different languages. \n\nSo it was generated in English and translated into Russian, preserving the meaning as much as possible.", "### Creation\n\n\nThe dataset was created in 5 stages:\n\n\n1. Generating a large number of tasks with answer choices on given topics in English using gpt-4-1106-preview\n2. Deduplication and condition filtering using intfloat/e5-large-v2\n3. 4 independent runs of gpt-4-1106-preview were conducted to determine the best answer for each task\n4. All tasks for which gpt-4-1106-preview diverged in answers from 2 times in 4 independent generations were removed\n5. Translation of the problems into Russian, also using gpt-4-1106-preview, to create a parallel corpus", "### Topics of logic puzzles", "### Correct answer choice distribution", "### Authors\n\n\nSergei Bratchikov (Tochka Bank) - @hivaze" ]
[ "TAGS\n#size_categories-n<1K #language-English #language-Russian #license-apache-2.0 #benchmark #logic #instruct #reasoning #region-us \n", "### Languages\n\n\nThis benchmark is also part of an initiative to evaluate the ability of models to think in different languages. \n\nSo it was generated in English and translated into Russian, preserving the meaning as much as possible.", "### Creation\n\n\nThe dataset was created in 5 stages:\n\n\n1. Generating a large number of tasks with answer choices on given topics in English using gpt-4-1106-preview\n2. Deduplication and condition filtering using intfloat/e5-large-v2\n3. 4 independent runs of gpt-4-1106-preview were conducted to determine the best answer for each task\n4. All tasks for which gpt-4-1106-preview diverged in answers from 2 times in 4 independent generations were removed\n5. Translation of the problems into Russian, also using gpt-4-1106-preview, to create a parallel corpus", "### Topics of logic puzzles", "### Correct answer choice distribution", "### Authors\n\n\nSergei Bratchikov (Tochka Bank) - @hivaze" ]
8288586ba7daccc2d46b9916d529f66b2b148fd2
# Dataset Card for Evaluation run of jpechg/Sour-Marcoro-12.5B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [jpechg/Sour-Marcoro-12.5B](https://huggingface.co/jpechg/Sour-Marcoro-12.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T01:13:13.191577](https://huggingface.co/datasets/open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B/blob/main/results_2024-02-02T01-13-13.191577.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6581308018783164, "acc_stderr": 0.03193252324169342, "acc_norm": 0.6618395212033175, "acc_norm_stderr": 0.03257776906432054, "mc1": 0.5397796817625459, "mc1_stderr": 0.017448017223960877, "mc2": 0.6816993058639923, "mc2_stderr": 0.015465736469164977 }, "harness|arc:challenge|25": { "acc": 0.659556313993174, "acc_stderr": 0.013847460518892978, "acc_norm": 0.6791808873720137, "acc_norm_stderr": 0.013640943091946526 }, "harness|hellaswag|10": { "acc": 0.6563433578968333, "acc_stderr": 0.004739575380508865, "acc_norm": 0.8369846644094802, "acc_norm_stderr": 0.0036862475593618374 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.7, "acc_stderr": 0.04605661864718381, "acc_norm": 0.7, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.03567603799639171, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.03567603799639171 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383887, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383887 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6595744680851063, "acc_stderr": 0.030976692998534443, "acc_norm": 0.6595744680851063, "acc_norm_stderr": 0.030976692998534443 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6413793103448275, "acc_stderr": 0.039966295748767186, "acc_norm": 0.6413793103448275, "acc_norm_stderr": 0.039966295748767186 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.0256993528321318, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.0256993528321318 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554952, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554952 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175007, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8383838383838383, "acc_stderr": 0.026225919863629283, "acc_norm": 0.8383838383838383, "acc_norm_stderr": 0.026225919863629283 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644244, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644244 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6923076923076923, "acc_stderr": 0.0234009289183105, "acc_norm": 0.6923076923076923, "acc_norm_stderr": 0.0234009289183105 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066482, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7226890756302521, "acc_stderr": 0.02907937453948001, "acc_norm": 0.7226890756302521, "acc_norm_stderr": 0.02907937453948001 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 
0.03995524007681681, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.03995524007681681 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507334, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507334 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.033922384053216174, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.033922384053216174 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8523206751054853, "acc_stderr": 0.0230943295825957, "acc_norm": 0.8523206751054853, "acc_norm_stderr": 0.0230943295825957 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.031602951437766785, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596915, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596915 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709696, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709696 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573973, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573973 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560406, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560406 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8071519795657727, "acc_stderr": 0.014108533515757431, "acc_norm": 0.8071519795657727, "acc_norm_stderr": 0.014108533515757431 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508287, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41899441340782123, "acc_stderr": 0.016501579306861677, "acc_norm": 0.41899441340782123, "acc_norm_stderr": 0.016501579306861677 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137894, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137894 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984824, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984824 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7932098765432098, "acc_stderr": 0.022535006705942842, "acc_norm": 0.7932098765432098, "acc_norm_stderr": 0.022535006705942842 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5177304964539007, "acc_stderr": 0.02980873964223777, "acc_norm": 0.5177304964539007, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4810951760104302, "acc_stderr": 0.012761104871472655, "acc_norm": 0.4810951760104302, "acc_norm_stderr": 0.012761104871472655 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7242647058823529, "acc_stderr": 0.027146271936625162, "acc_norm": 0.7242647058823529, "acc_norm_stderr": 0.027146271936625162 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.01909422816700031, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.01909422816700031 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.02916273841024977, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.02916273841024977 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.038444531817709175, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.038444531817709175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.031581495393387324, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.031581495393387324 }, "harness|truthfulqa:mc|0": { "mc1": 0.5397796817625459, "mc1_stderr": 0.017448017223960877, "mc2": 0.6816993058639923, "mc2_stderr": 0.015465736469164977 }, "harness|winogrande|5": { "acc": 0.8208366219415943, "acc_stderr": 0.010777949156047989 }, "harness|gsm8k|5": { "acc": 0.47687642153146326, "acc_stderr": 0.013757748544245326 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
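As a supplement, the configuration metadata recorded below for this record lists an aggregated `results` configuration with a `latest` split; the following is a minimal sketch of loading it (the repository id is taken from this record, and the variable name is purely illustrative):

```python
from datasets import load_dataset

# Minimal sketch: pull the aggregated "results" configuration for this run.
# The config name "results" and the "latest" split come from this record's
# configuration metadata; adjust them if the repository layout has changed.
aggregated = load_dataset(
    "open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B",
    "results",
    split="latest",
)
print(aggregated)
```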
open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B
[ "region:us" ]
2024-02-02T01:15:33+00:00
{"pretty_name": "Evaluation run of jpechg/Sour-Marcoro-12.5B", "dataset_summary": "Dataset automatically created during the evaluation run of model [jpechg/Sour-Marcoro-12.5B](https://huggingface.co/jpechg/Sour-Marcoro-12.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T01:13:13.191577](https://huggingface.co/datasets/open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B/blob/main/results_2024-02-02T01-13-13.191577.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6581308018783164,\n \"acc_stderr\": 0.03193252324169342,\n \"acc_norm\": 0.6618395212033175,\n \"acc_norm_stderr\": 0.03257776906432054,\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6816993058639923,\n \"mc2_stderr\": 0.015465736469164977\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946526\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6563433578968333,\n \"acc_stderr\": 0.004739575380508865,\n \"acc_norm\": 0.8369846644094802,\n \"acc_norm_stderr\": 0.0036862475593618374\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.03567603799639171,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.03567603799639171\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534443,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534443\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.0234009289183105,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.0234009289183105\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.03995524007681681,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.03995524007681681\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507334,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507334\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7932098765432098,\n \"acc_stderr\": 0.022535006705942842,\n \"acc_norm\": 0.7932098765432098,\n \"acc_norm_stderr\": 0.022535006705942842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n \"acc_stderr\": 0.012761104871472655,\n \"acc_norm\": 0.4810951760104302,\n \"acc_norm_stderr\": 0.012761104871472655\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700031,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700031\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6816993058639923,\n \"mc2_stderr\": 0.015465736469164977\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047989\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47687642153146326,\n \"acc_stderr\": 
0.013757748544245326\n }\n}\n```", "repo_url": "https://huggingface.co/jpechg/Sour-Marcoro-12.5B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|arc:challenge|25_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|gsm8k|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hellaswag|10_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T01_13_13.191577", "path": ["**/details_harness|winogrande|5_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T01-13-13.191577.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T01_13_13.191577", "path": ["results_2024-02-02T01-13-13.191577.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T01-13-13.191577.parquet"]}]}]}
2024-02-02T01:17:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jpechg/Sour-Marcoro-12.5B Dataset automatically created during the evaluation run of model jpechg/Sour-Marcoro-12.5B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T01:13:13.191577 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of jpechg/Sour-Marcoro-12.5B\n\n\n\nDataset automatically created during the evaluation run of model jpechg/Sour-Marcoro-12.5B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T01:13:13.191577(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jpechg/Sour-Marcoro-12.5B\n\n\n\nDataset automatically created during the evaluation run of model jpechg/Sour-Marcoro-12.5B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T01:13:13.191577(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5ad23c684ea548250d8717a05be14bd27653191d
# Dataset Card for Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Stopwolf/DistilabelCerberus-7B-slerp](https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T01:28:41.378025](https://huggingface.co/datasets/open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp/blob/main/results_2024-02-02T01-28-41.378025.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval): ```python { "all": { "acc": 0.6464333239348551, "acc_stderr": 0.032147073947899604, "acc_norm": 0.6464670335536932, "acc_norm_stderr": 0.03280457322475929, "mc1": 0.4455324357405141, "mc1_stderr": 0.017399335280140354, "mc2": 0.609312831167026, "mc2_stderr": 0.015494903078684579 }, "harness|arc:challenge|25": { "acc": 0.6544368600682594, "acc_stderr": 0.013896938461145687, "acc_norm": 0.681740614334471, "acc_norm_stderr": 0.013611993916971453 }, "harness|hellaswag|10": { "acc": 0.6928898625771759, "acc_stderr": 0.004603527017557838, "acc_norm": 0.867755427205736, "acc_norm_stderr": 0.003380641470989925 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.036812296333943194, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229865, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229865 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.02918571494985741, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.02918571494985741 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590167, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590167 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02675640153807897, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02675640153807897 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.025530100460233494, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.025530100460233494 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159464, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159464 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.03192193448934724, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.03192193448934724 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077802, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077802 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3854748603351955, "acc_stderr": 0.01627792703963819, "acc_norm": 0.3854748603351955, "acc_norm_stderr": 0.01627792703963819 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729477, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729477 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959603, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959603 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47392438070404175, "acc_stderr": 0.012752858346533131, "acc_norm": 0.47392438070404175, "acc_norm_stderr": 0.012752858346533131 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.027539122889061456, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.027539122889061456 }, "harness|truthfulqa:mc|0": { "mc1": 0.4455324357405141, "mc1_stderr": 0.017399335280140354, "mc2": 0.609312831167026, "mc2_stderr": 0.015494903078684579 }, "harness|winogrande|5": { "acc": 0.7947908445146015, "acc_stderr": 0.011350315707462057 }, "harness|gsm8k|5": { "acc": 0.6982562547384382, "acc_stderr": 0.012643544762873354 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
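## Additional loading examples

As a supplement to the per-task snippet above, here is a minimal sketch of how the aggregated "results" configuration and the "latest" split alias could be loaded. The repository, configuration, and split names are taken from this card's metadata; the exact columns returned depend on the evaluation run.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp"

# Aggregated metrics for the whole run (the "results" configuration described above).
aggregated = load_dataset(
    REPO,
    "results",
    split="latest",  # or the timestamped split, e.g. "2024_02_02T01_28_41.378025"
)

# Per-sample details for a single task, pinned to the latest evaluation
# rather than the "train" alias used in the snippet above.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

print(aggregated)
print(gsm8k_details)
```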
open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp
[ "region:us" ]
2024-02-02T01:31:02+00:00
{"pretty_name": "Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Stopwolf/DistilabelCerberus-7B-slerp](https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T01:28:41.378025](https://huggingface.co/datasets/open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp/blob/main/results_2024-02-02T01-28-41.378025.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6464333239348551,\n \"acc_stderr\": 0.032147073947899604,\n \"acc_norm\": 0.6464670335536932,\n \"acc_norm_stderr\": 0.03280457322475929,\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.609312831167026,\n \"mc2_stderr\": 0.015494903078684579\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145687,\n \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6928898625771759,\n \"acc_stderr\": 0.004603527017557838,\n \"acc_norm\": 0.867755427205736,\n \"acc_norm_stderr\": 0.003380641470989925\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n 
\"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533131,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.609312831167026,\n \"mc2_stderr\": 0.015494903078684579\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462057\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 0.012643544762873354\n }\n}\n```", "repo_url": 
"https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|arc:challenge|25_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|gsm8k|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hellaswag|10_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T01_28_41.378025", "path": ["**/details_harness|winogrande|5_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T01-28-41.378025.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T01_28_41.378025", "path": ["results_2024-02-02T01-28-41.378025.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T01-28-41.378025.parquet"]}]}]}
2024-02-02T01:31:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp Dataset automatically created during the evaluation run of model Stopwolf/DistilabelCerberus-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T01:28:41.378025 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Stopwolf/DistilabelCerberus-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T01:28:41.378025(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Stopwolf/DistilabelCerberus-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T01:28:41.378025(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
cb32efb43712f88f4b980c3db2552425270a4c1e
# Dataset Card for "context_pubmed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lhallee/context_pubmed
[ "region:us" ]
2024-02-02T01:37:30+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1731507128, "num_examples": 1270940}], "download_size": 983766153, "dataset_size": 1731507128}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-02T01:38:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "context_pubmed" More Information needed
[ "# Dataset Card for \"context_pubmed\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"context_pubmed\"\n\nMore Information needed" ]
1dd88ca89fbe2945e6985d5f276087fe8fba8033
# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v4 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [0x7194633/fialka-13B-v4](https://huggingface.co/0x7194633/fialka-13B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_0x7194633__fialka-13B-v4", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:04:36.261546](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__fialka-13B-v4/blob/main/results_2024-02-02T02-04-36.261546.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.25490120999301497, "acc_stderr": 0.030530419104320212, "acc_norm": 0.2554295986850323, "acc_norm_stderr": 0.03130993030375039, "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006523, "mc2": 0.43649124278295964, "mc2_stderr": 0.014606909564780788 }, "harness|arc:challenge|25": { "acc": 0.2738907849829352, "acc_stderr": 0.013032004972989503, "acc_norm": 0.29692832764505117, "acc_norm_stderr": 0.013352025976725223 }, "harness|hellaswag|10": { "acc": 0.38279227245568614, "acc_stderr": 0.0048507486878599255, "acc_norm": 0.4737104162517427, "acc_norm_stderr": 0.004982879340691398 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2962962962962963, "acc_stderr": 0.03944624162501116, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03944624162501116 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.21710526315789475, "acc_stderr": 0.033550453048829226, "acc_norm": 0.21710526315789475, "acc_norm_stderr": 0.033550453048829226 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21132075471698114, "acc_stderr": 0.02512576648482784, "acc_norm": 0.21132075471698114, "acc_norm_stderr": 0.02512576648482784 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2708333333333333, "acc_stderr": 0.037161774375660164, "acc_norm": 0.2708333333333333, "acc_norm_stderr": 0.037161774375660164 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3063583815028902, "acc_stderr": 0.03514942551267438, "acc_norm": 0.3063583815028902, "acc_norm_stderr": 0.03514942551267438 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237656, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237656 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.23404255319148937, "acc_stderr": 0.027678452578212387, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.027678452578212387 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.0414243971948936, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.0414243971948936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135303, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135303 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.022019080012217897, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.022019080012217897 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.18253968253968253, "acc_stderr": 0.03455071019102146, "acc_norm": 0.18253968253968253, "acc_norm_stderr": 0.03455071019102146 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.16, "acc_stderr": 0.03684529491774708, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3193548387096774, "acc_stderr": 0.02652270967466777, "acc_norm": 0.3193548387096774, "acc_norm_stderr": 0.02652270967466777 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.03127090713297698, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.03127090713297698 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.15, "acc_stderr": 0.0358870281282637, "acc_norm": 0.15, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23636363636363636, "acc_stderr": 0.03317505930009179, "acc_norm": 0.23636363636363636, "acc_norm_stderr": 0.03317505930009179 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3181818181818182, "acc_stderr": 0.03318477333845331, "acc_norm": 0.3181818181818182, "acc_norm_stderr": 0.03318477333845331 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.29533678756476683, "acc_stderr": 0.03292296639155139, "acc_norm": 0.29533678756476683, "acc_norm_stderr": 0.03292296639155139 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.021444547301560476, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.021444547301560476 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073835, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073835 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23949579831932774, "acc_stderr": 0.02772206549336126, "acc_norm": 0.23949579831932774, "acc_norm_stderr": 0.02772206549336126 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23119266055045873, "acc_stderr": 0.018075750241633156, "acc_norm": 0.23119266055045873, "acc_norm_stderr": 0.018075750241633156 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.20588235294117646, "acc_stderr": 0.028379449451588674, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.028379449451588674 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25316455696202533, "acc_stderr": 0.028304657943035303, "acc_norm": 0.25316455696202533, "acc_norm_stderr": 0.028304657943035303 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2242152466367713, "acc_stderr": 0.02799153425851952, "acc_norm": 0.2242152466367713, "acc_norm_stderr": 0.02799153425851952 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2366412213740458, "acc_stderr": 0.037276735755969174, "acc_norm": 0.2366412213740458, "acc_norm_stderr": 0.037276735755969174 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04065578140908705, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.23148148148148148, "acc_stderr": 0.04077494709252628, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.04077494709252628 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.03259177392742179, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.22321428571428573, "acc_stderr": 0.039523019677025116, "acc_norm": 0.22321428571428573, "acc_norm_stderr": 0.039523019677025116 }, "harness|hendrycksTest-management|5": { "acc": 0.1553398058252427, "acc_stderr": 0.035865947385739734, "acc_norm": 0.1553398058252427, "acc_norm_stderr": 0.035865947385739734 }, "harness|hendrycksTest-marketing|5": { "acc": 0.21794871794871795, "acc_stderr": 0.027046857630716677, "acc_norm": 0.21794871794871795, "acc_norm_stderr": 0.027046857630716677 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2541507024265645, "acc_stderr": 0.01556925469204577, "acc_norm": 0.2541507024265645, "acc_norm_stderr": 0.01556925469204577 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.22254335260115607, "acc_stderr": 0.02239421566194282, "acc_norm": 0.22254335260115607, "acc_norm_stderr": 0.02239421566194282 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2446927374301676, "acc_stderr": 0.014378169884098407, "acc_norm": 0.2446927374301676, "acc_norm_stderr": 0.014378169884098407 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3006535947712418, "acc_stderr": 0.02625605383571896, "acc_norm": 0.3006535947712418, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.022122439772480774, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.022122439772480774 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.22839506172839505, "acc_stderr": 0.023358211840626267, "acc_norm": 0.22839506172839505, "acc_norm_stderr": 0.023358211840626267 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.2553191489361702, "acc_stderr": 0.026011992930902013, "acc_norm": 0.2553191489361702, "acc_norm_stderr": 0.026011992930902013 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2588005215123859, "acc_stderr": 0.01118610904656461, "acc_norm": 0.2588005215123859, "acc_norm_stderr": 0.01118610904656461 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4375, "acc_stderr": 0.030134614954403924, "acc_norm": 0.4375, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2581699346405229, "acc_stderr": 0.017704531653250068, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.017704531653250068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2818181818181818, "acc_stderr": 0.043091187099464585, "acc_norm": 0.2818181818181818, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2571428571428571, "acc_stderr": 0.02797982353874455, "acc_norm": 0.2571428571428571, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.18407960199004975, "acc_stderr": 0.027403859410786834, "acc_norm": 0.18407960199004975, "acc_norm_stderr": 0.027403859410786834 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.18674698795180722, "acc_stderr": 0.030338749144500597, "acc_norm": 0.18674698795180722, "acc_norm_stderr": 0.030338749144500597 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2046783625730994, "acc_stderr": 0.030944459778533193, "acc_norm": 0.2046783625730994, "acc_norm_stderr": 0.030944459778533193 }, "harness|truthfulqa:mc|0": { "mc1": 0.2729498164014688, "mc1_stderr": 0.015594753632006523, "mc2": 0.43649124278295964, "mc2_stderr": 0.014606909564780788 }, "harness|winogrande|5": { "acc": 0.5887924230465666, "acc_stderr": 0.01382912835867687 }, "harness|gsm8k|5": { "acc": 0.0037907505686125853, "acc_stderr": 0.0016927007401501951 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
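The fialka-13B-v4 card above already shows how to load the `harness_winogrande_5` details; the same pattern applies to the per-task MMLU configs listed in this record's metadata below. A sketch using the `harness_hendrycksTest_professional_law_5` config (one of the names that appears in that metadata):

```python
from datasets import load_dataset

# Each MMLU subtask has its own config; "latest" points at the most recent run,
# while the timestamped split (e.g. "2024_02_02T02_04_36.261546") pins a specific one.
law_details = load_dataset(
    "open-llm-leaderboard/details_0x7194633__fialka-13B-v4",
    "harness_hendrycksTest_professional_law_5",
    split="latest",
)
print(law_details)
```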
open-llm-leaderboard/details_0x7194633__fialka-13B-v4
[ "region:us" ]
2024-02-02T02:06:29+00:00
{"pretty_name": "Evaluation run of 0x7194633/fialka-13B-v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [0x7194633/fialka-13B-v4](https://huggingface.co/0x7194633/fialka-13B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0x7194633__fialka-13B-v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:04:36.261546](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__fialka-13B-v4/blob/main/results_2024-02-02T02-04-36.261546.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25490120999301497,\n \"acc_stderr\": 0.030530419104320212,\n \"acc_norm\": 0.2554295986850323,\n \"acc_norm_stderr\": 0.03130993030375039,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006523,\n \"mc2\": 0.43649124278295964,\n \"mc2_stderr\": 0.014606909564780788\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2738907849829352,\n \"acc_stderr\": 0.013032004972989503,\n \"acc_norm\": 0.29692832764505117,\n \"acc_norm_stderr\": 0.013352025976725223\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38279227245568614,\n \"acc_stderr\": 0.0048507486878599255,\n \"acc_norm\": 0.4737104162517427,\n \"acc_norm_stderr\": 0.004982879340691398\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.02512576648482784,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.02512576648482784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 
0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.3063583815028902,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212387,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212387\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217897,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217897\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3193548387096774,\n \"acc_stderr\": 0.02652270967466777,\n \"acc_norm\": 0.3193548387096774,\n \"acc_norm_stderr\": 0.02652270967466777\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.29533678756476683,\n \"acc_stderr\": 0.03292296639155139,\n \"acc_norm\": 0.29533678756476683,\n \"acc_norm_stderr\": 0.03292296639155139\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560476,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336126,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336126\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23119266055045873,\n \"acc_stderr\": 0.018075750241633156,\n \"acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.018075750241633156\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.028379449451588674,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.028379449451588674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2242152466367713,\n \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.2242152466367713,\n \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969174,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969174\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.035865947385739734,\n \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.035865947385739734\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n \"acc_stderr\": 0.01556925469204577,\n \"acc_norm\": 0.2541507024265645,\n \"acc_norm_stderr\": 0.01556925469204577\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.022122439772480774,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.022122439772480774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n \"acc_stderr\": 0.01118610904656461,\n \"acc_norm\": 0.2588005215123859,\n \"acc_norm_stderr\": 0.01118610904656461\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250068,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.18407960199004975,\n \"acc_stderr\": 0.027403859410786834,\n \"acc_norm\": 0.18407960199004975,\n \"acc_norm_stderr\": 0.027403859410786834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18674698795180722,\n \"acc_stderr\": 0.030338749144500597,\n \"acc_norm\": 0.18674698795180722,\n \"acc_norm_stderr\": 0.030338749144500597\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006523,\n \"mc2\": 0.43649124278295964,\n \"mc2_stderr\": 0.014606909564780788\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.01382912835867687\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501951\n }\n}\n```", "repo_url": "https://huggingface.co/0x7194633/fialka-13B-v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-04-36.261546.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-04-36.261546.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-04-36.261546.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-04-36.261546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-04-36.261546.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_04_36.261546", "path": ["**/details_harness|winogrande|5_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-04-36.261546.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_04_36.261546", "path": ["results_2024-02-02T02-04-36.261546.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-04-36.261546.parquet"]}]}]}
2024-02-02T02:06:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v4 Dataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v4 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:04:36.261546 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
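The processed card text above says "you can for instance do the following" but the accompanying snippet was stripped during text extraction; a minimal sketch of the usual loading call, assuming the same repo naming pattern as the other evaluation runs in this dump:

```python
from datasets import load_dataset

# Repo id inferred from the "details_<org>__<model>" pattern; treat it as an assumption.
data = load_dataset(
    "open-llm-leaderboard/details_0x7194633__fialka-13B-v4",
    "harness_winogrande_5",
    split="train",
)
print(data)
```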
[ "# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v4\n\n\n\nDataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:04:36.261546(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v4\n\n\n\nDataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:04:36.261546(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b56162d14d81068f4ce1dec7a6b776af5208c244
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
kevinliu0619/card
[ "region:us" ]
2024-02-02T02:07:51+00:00
{}
2024-02-02T02:09:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f71e1f38c84fd84b852131eef19722b9a05ab326
# Dataset Card for Evaluation run of ConvexAI/Harmony-4x7B-bf16 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ConvexAI/Harmony-4x7B-bf16](https://huggingface.co/ConvexAI/Harmony-4x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:08:05.844408](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16/blob/main/results_2024-02-02T02-08-05.844408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6519815241664966, "acc_stderr": 0.03207888973753625, "acc_norm": 0.6516491536204881, "acc_norm_stderr": 0.03274551770079104, "mc1": 0.4602203182374541, "mc1_stderr": 0.01744801722396088, "mc2": 0.6205565135867297, "mc2_stderr": 0.015134580676846265 }, "harness|arc:challenge|25": { "acc": 0.658703071672355, "acc_stderr": 0.013855831287497724, "acc_norm": 0.6834470989761092, "acc_norm_stderr": 0.013592431519068077 }, "harness|hellaswag|10": { "acc": 0.6810396335391357, "acc_stderr": 0.004651211311633843, "acc_norm": 0.8674566819358693, "acc_norm_stderr": 0.0033838751726700243 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.037150621549989056, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.037150621549989056 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909282, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.025487187147859375, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.025487187147859375 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.02390491431178265, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.02390491431178265 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37777777777777777, "acc_stderr": 0.02956070739246572, "acc_norm": 0.37777777777777777, "acc_norm_stderr": 0.02956070739246572 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.02983796238829193, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.02983796238829193 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474086, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474086 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.034465133507525995, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.034465133507525995 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.047184714852195886, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.047184714852195886 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608304, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608304 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4011173184357542, "acc_stderr": 0.01639222189940708, "acc_norm": 0.4011173184357542, "acc_norm_stderr": 0.01639222189940708 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292452, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292452 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.01274520462608314, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.01274520462608314 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.0279715413701706, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.0279715413701706 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.4602203182374541, "mc1_stderr": 0.01744801722396088, "mc2": 0.6205565135867297, "mc2_stderr": 0.015134580676846265 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.01094187795567621 }, "harness|gsm8k|5": { "acc": 0.7210007581501138, "acc_stderr": 0.012354115779970308 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
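The card above states that an additional "results" configuration stores the aggregated metrics of the run; a minimal sketch for reading those aggregates back. The "results" config and "latest" split follow the layout declared in the metadata of the other evaluation datasets in this dump, so treat the exact names as an assumption:

```python
from datasets import load_dataset

# Aggregated metrics of the 2024-02-02T02:08:05 run; "latest" tracks the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16",
    "results",
    split="latest",
)
# A single row holding the serialized metrics, e.g. the overall acc/acc_norm quoted above.
print(results[0])
```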
open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16
[ "region:us" ]
2024-02-02T02:10:25+00:00
{"pretty_name": "Evaluation run of ConvexAI/Harmony-4x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Harmony-4x7B-bf16](https://huggingface.co/ConvexAI/Harmony-4x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:08:05.844408](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16/blob/main/results_2024-02-02T02-08-05.844408.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6519815241664966,\n \"acc_stderr\": 0.03207888973753625,\n \"acc_norm\": 0.6516491536204881,\n \"acc_norm_stderr\": 0.03274551770079104,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6205565135867297,\n \"mc2_stderr\": 0.015134580676846265\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497724,\n \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068077\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6810396335391357,\n \"acc_stderr\": 0.004651211311633843,\n \"acc_norm\": 0.8674566819358693,\n \"acc_norm_stderr\": 0.0033838751726700243\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829193,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829193\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525995,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525995\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.01274520462608314,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.01274520462608314\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6205565135867297,\n \"mc2_stderr\": 0.015134580676846265\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7210007581501138,\n \"acc_stderr\": 0.012354115779970308\n }\n}\n```", 
"repo_url": "https://huggingface.co/ConvexAI/Harmony-4x7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-08-05.844408.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-08-05.844408.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-08-05.844408.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-08-05.844408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-08-05.844408.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_08_05.844408", "path": ["**/details_harness|winogrande|5_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-08-05.844408.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_08_05.844408", "path": ["results_2024-02-02T02-08-05.844408.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-08-05.844408.parquet"]}]}]}
2024-02-02T02:10:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ConvexAI/Harmony-4x7B-bf16 Dataset automatically created during the evaluation run of model ConvexAI/Harmony-4x7B-bf16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch below): ## Latest results These are the latest results from run 2024-02-02T02:08:05.844408 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
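The code block that the sentence "To load the details from a run, you can for instance do the following" points to was dropped from this flattened copy of the card; a minimal sketch of the loading call, assuming the details repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming used by the other records in this dump:

```python
from datasets import load_dataset

# Assumed repository id; the configuration name and split mirror the loading
# example shown on the other evaluation cards in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_ConvexAI__Harmony-4x7B-bf16",
    "harness_winogrande_5",
    split="train",
)
```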
[ "# Dataset Card for Evaluation run of ConvexAI/Harmony-4x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Harmony-4x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:08:05.844408(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ConvexAI/Harmony-4x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Harmony-4x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:08:05.844408(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d26e1f7c5aecb2fba5e4bbafb5f2a405b4e9cc04
# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-instruction <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [huseyinatahaninan/phi-2-instruction](https://huggingface.co/huseyinatahaninan/phi-2-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T15:24:48.526289](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction/blob/main/results_2024-02-02T15-24-48.526289.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5792594075683091, "acc_stderr": 0.03371661822949884, "acc_norm": 0.5811321590895305, "acc_norm_stderr": 0.03440439830365177, "mc1": 0.30966952264381886, "mc1_stderr": 0.016185744355144912, "mc2": 0.44956524421338884, "mc2_stderr": 0.015113609603273521 }, "harness|arc:challenge|25": { "acc": 0.5819112627986348, "acc_stderr": 0.014413988396996083, "acc_norm": 0.613481228668942, "acc_norm_stderr": 0.014230084761910481 }, "harness|hellaswag|10": { "acc": 0.5595498904600678, "acc_stderr": 0.0049542655953734634, "acc_norm": 0.7472615016928899, "acc_norm_stderr": 0.004336941069568736 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 
0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5780346820809249, "acc_stderr": 0.0376574669386515, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383888, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383888 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.502127659574468, "acc_stderr": 0.03268572658667493, "acc_norm": 0.502127659574468, "acc_norm_stderr": 0.03268572658667493 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192118, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192118 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440679, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440679 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.043435254289490965, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.043435254289490965 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7161290322580646, "acc_stderr": 0.02564938106302926, "acc_norm": 0.7161290322580646, "acc_norm_stderr": 0.02564938106302926 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511656986, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.036974422050315967, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.036974422050315967 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8186528497409327, "acc_stderr": 0.027807032360686088, "acc_norm": 0.8186528497409327, "acc_norm_stderr": 0.027807032360686088 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.02506909438729653, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.02506909438729653 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608463, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608463 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.03128217706368461, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.03128217706368461 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.017381415563608674, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.017381415563608674 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6470588235294118, "acc_stderr": 0.03354092437591518, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.03354092437591518 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.029443773022594693, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.029443773022594693 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.040261875275912046, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.040261875275912046 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260594, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260594 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.026246772946890484, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.026246772946890484 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6973180076628352, "acc_stderr": 0.01642878158174936, "acc_norm": 0.6973180076628352, "acc_norm_stderr": 0.01642878158174936 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6560693641618497, "acc_stderr": 0.025574123786546672, "acc_norm": 0.6560693641618497, "acc_norm_stderr": 0.025574123786546672 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249622, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249622 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6209150326797386, "acc_stderr": 0.027780141207023344, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.027780141207023344 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6463022508038585, "acc_stderr": 0.027155208103200868, "acc_norm": 0.6463022508038585, "acc_norm_stderr": 0.027155208103200868 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027125115513166848, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 
0.027125115513166848 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.425531914893617, "acc_stderr": 0.02949482760014437, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.02949482760014437 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4282920469361147, "acc_stderr": 0.012638223880313161, "acc_norm": 0.4282920469361147, "acc_norm_stderr": 0.012638223880313161 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03025437257397669, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03025437257397669 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5588235294117647, "acc_stderr": 0.020087362076702853, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.020087362076702853 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.02916273841024977, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.02916273841024977 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768928, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768928 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.30966952264381886, "mc1_stderr": 0.016185744355144912, "mc2": 0.44956524421338884, "mc2_stderr": 0.015113609603273521 }, "harness|winogrande|5": { "acc": 0.7419100236779794, "acc_stderr": 0.01229827883397239 }, "harness|gsm8k|5": { "acc": 0.5253980288097043, "acc_stderr": 0.013754705089112314 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
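The "Latest results" section above links to a raw JSON file stored inside the dataset repository; a minimal sketch of fetching that file and reading the aggregate block, assuming `huggingface_hub` is installed and that the file keeps the repository id and path shown in the link (the key layout is inferred from the excerpt above):

```python
import json
from huggingface_hub import hf_hub_download

# Repository id and filename are taken from the "Latest results" link in the card above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction",
    filename="results_2024-02-02T15-24-48.526289.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The excerpt above shows an "all" block holding the aggregate metrics; fall back to a
# top-level "results" wrapper in case the raw file nests them (layout assumption).
aggregates = results.get("results", results)["all"]
print(aggregates["acc"], aggregates["acc_norm"], aggregates["mc2"])
```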
open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction
[ "region:us" ]
2024-02-02T02:12:29+00:00
{"pretty_name": "Evaluation run of huseyinatahaninan/phi-2-instruction", "dataset_summary": "Dataset automatically created during the evaluation run of model [huseyinatahaninan/phi-2-instruction](https://huggingface.co/huseyinatahaninan/phi-2-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T15:24:48.526289](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction/blob/main/results_2024-02-02T15-24-48.526289.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5792594075683091,\n \"acc_stderr\": 0.03371661822949884,\n \"acc_norm\": 0.5811321590895305,\n \"acc_norm_stderr\": 0.03440439830365177,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.44956524421338884,\n \"mc2_stderr\": 0.015113609603273521\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996083,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910481\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5595498904600678,\n \"acc_stderr\": 0.0049542655953734634,\n \"acc_norm\": 0.7472615016928899,\n \"acc_norm_stderr\": 0.004336941069568736\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667493,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667493\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n \"acc_norm\": 0.8186528497409327,\n 
\"acc_norm_stderr\": 0.027807032360686088\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890484,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890484\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6973180076628352,\n \"acc_stderr\": 0.01642878158174936,\n \"acc_norm\": 0.6973180076628352,\n \"acc_norm_stderr\": 0.01642878158174936\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546672,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546672\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249622,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249622\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166848,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166848\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n \"acc_stderr\": 0.012638223880313161,\n \"acc_norm\": 0.4282920469361147,\n \"acc_norm_stderr\": 0.012638223880313161\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03025437257397669,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03025437257397669\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702853,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702853\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768928,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768928\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.44956524421338884,\n \"mc2_stderr\": 0.015113609603273521\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.01229827883397239\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5253980288097043,\n \"acc_stderr\": 0.013754705089112314\n }\n}\n```", "repo_url": "https://huggingface.co/huseyinatahaninan/phi-2-instruction", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|arc:challenge|25_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|gsm8k|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hellaswag|10_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-10-43.910998.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-10-43.910998.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-24-48.526289.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-24-48.526289.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-24-48.526289.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T15-24-48.526289.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-24-48.526289.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": 
"2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-10-43.910998.parquet"]}, 
{"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["**/details_harness|winogrande|5_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": ["**/details_harness|winogrande|5_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T15-24-48.526289.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T02_10_43.910998", "path": ["results_2024-02-02T02-10-43.910998.parquet"]}, {"split": "2024_02_02T15_24_48.526289", "path": 
["results_2024-02-02T15-24-48.526289.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T15-24-48.526289.parquet"]}]}]}
2024-02-02T15:26:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-instruction Dataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-instruction on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a minimal loading example is given after this summary): ## Latest results These are the latest results from run 2024-02-02T15:24:48.526289 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
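A minimal loading sketch to accompany the summary above, mirroring the `load_dataset` pattern used for the other evaluation runs in this collection; `harness_winogrande_5` is just one of the 63 configs listed in the metadata and can be swapped for any other:

```python
from datasets import load_dataset

# Load one evaluation config from the details repository; per the card,
# the "train" split always points to the latest run for this model.
data = load_dataset(
    "open-llm-leaderboard/details_huseyinatahaninan__phi-2-instruction",
    "harness_winogrande_5",
    split="train",
)
```

Passing `"results"` as the config name instead returns the aggregated run-level metrics shown under "Latest results".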
[ "# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-instruction\n\n\n\nDataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-instruction on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T15:24:48.526289(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-instruction\n\n\n\nDataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-instruction on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T15:24:48.526289(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
461abdb8b77984c263f8f89bebf8b06a9613401a
$\color{#FF0000}{红色字}$即可 # Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
kevinliu0619/2.2aaa
[ "region:us" ]
2024-02-02T02:16:50+00:00
{}
2024-02-02T02:23:56+00:00
[]
[]
TAGS #region-us
$\color{#FF0000}{红色字}$即可 # Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
79476531836cb653299202077face288f47c2c70
# Dataset Card for Evaluation run of shadowml/BeagSake-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [shadowml/BeagSake-7B](https://huggingface.co/shadowml/BeagSake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_shadowml__BeagSake-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:17:55.720311](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__BeagSake-7B/blob/main/results_2024-02-02T02-17-55.720311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6578211772363932, "acc_stderr": 0.031956725676144875, "acc_norm": 0.6574768410421444, "acc_norm_stderr": 0.0326189871206691, "mc1": 0.572827417380661, "mc1_stderr": 0.01731683441096392, "mc2": 0.7227123192569592, "mc2_stderr": 0.01451322669078661 }, "harness|arc:challenge|25": { "acc": 0.7005119453924915, "acc_stderr": 0.013385021637313572, "acc_norm": 0.7244027303754266, "acc_norm_stderr": 0.01305716965576184 }, "harness|hellaswag|10": { "acc": 0.704142601075483, "acc_stderr": 0.0045549440206204845, "acc_norm": 0.8838876717785302, "acc_norm_stderr": 0.0031970484760036424 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03309615177059004, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.03309615177059004 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.025253032554997695, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.025253032554997695 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608303, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608303 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044283, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4223463687150838, "acc_stderr": 0.016519594275297117, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.016519594275297117 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47522816166883963, "acc_stderr": 0.012754553719781753, "acc_norm": 0.47522816166883963, "acc_norm_stderr": 0.012754553719781753 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406755, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406755 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.684640522875817, "acc_stderr": 0.018798086284886887, "acc_norm": 0.684640522875817, "acc_norm_stderr": 0.018798086284886887 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.572827417380661, "mc1_stderr": 0.01731683441096392, "mc2": 0.7227123192569592, "mc2_stderr": 0.01451322669078661 }, "harness|winogrande|5": { "acc": 0.8216258879242304, "acc_stderr": 0.010759352014855936 }, "harness|gsm8k|5": { "acc": 0.7179681576952237, "acc_stderr": 0.0123949265843357 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_shadowml__BeagSake-7B
[ "region:us" ]
2024-02-02T02:20:18+00:00
{"pretty_name": "Evaluation run of shadowml/BeagSake-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [shadowml/BeagSake-7B](https://huggingface.co/shadowml/BeagSake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__BeagSake-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:17:55.720311](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__BeagSake-7B/blob/main/results_2024-02-02T02-17-55.720311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578211772363932,\n \"acc_stderr\": 0.031956725676144875,\n \"acc_norm\": 0.6574768410421444,\n \"acc_norm_stderr\": 0.0326189871206691,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.01731683441096392,\n \"mc2\": 0.7227123192569592,\n \"mc2_stderr\": 0.01451322669078661\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.704142601075483,\n \"acc_stderr\": 0.0045549440206204845,\n \"acc_norm\": 0.8838876717785302,\n \"acc_norm_stderr\": 0.0031970484760036424\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059004,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059004\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 
0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6820512820512821,\n 
\"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n 
\"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.01731683441096392,\n \"mc2\": 0.7227123192569592,\n \"mc2_stderr\": 0.01451322669078661\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7179681576952237,\n \"acc_stderr\": 0.0123949265843357\n }\n}\n```", "repo_url": "https://huggingface.co/shadowml/BeagSake-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["**/details_harness|winogrande|5_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-17-55.720311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T02_17_55.720311", "path": ["results_2024-02-02T02-17-55.720311.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T02-17-55.720311.parquet"]}]}]}
2024-02-02T02:20:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shadowml/BeagSake-7B Dataset automatically created during the evaluation run of model shadowml/BeagSake-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:17:55.720311 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
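The load snippet that this card text refers to was stripped when the card was flattened into this record. A minimal sketch of the call it describes, assuming the leaderboard's usual `details_<org>__<model>` repo naming for shadowml/BeagSake-7B (the repo id below is inferred from that convention, not quoted from this record):

```python
from datasets import load_dataset

# Repo id is an assumption based on the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_shadowml__BeagSake-7B",
    "harness_winogrande_5",  # any config listed in this record's metadata works here
    split="latest",          # the "latest" split always points at the most recent run
)
print(data)
```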
[ "# Dataset Card for Evaluation run of shadowml/BeagSake-7B\n\n\n\nDataset automatically created during the evaluation run of model shadowml/BeagSake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:17:55.720311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shadowml/BeagSake-7B\n\n\n\nDataset automatically created during the evaluation run of model shadowml/BeagSake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:17:55.720311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ae2bc6e38c2e00ede5695f0c33482ed0a695cba0
# Dataset Card for Evaluation run of Gille/StrangeMerges_18-7B-dare_ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_18-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_18-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:26:03.631353](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties/blob/main/results_2024-02-02T02-26-03.631353.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.638141011794225, "acc_stderr": 0.03221763395931037, "acc_norm": 0.640122595431434, "acc_norm_stderr": 0.032859222090384846, "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.521661560648742, "mc2_stderr": 0.015256495321750132 }, "harness|arc:challenge|25": { "acc": 0.6109215017064846, "acc_stderr": 0.014247309976045607, "acc_norm": 0.6407849829351536, "acc_norm_stderr": 0.014020224155839162 }, "harness|hellaswag|10": { "acc": 0.6521609241187014, "acc_stderr": 0.004753112432728698, "acc_norm": 0.8436566421031667, "acc_norm_stderr": 0.003624383120823463 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353227, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353227 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.034765901043041336, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.034765901043041336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, 
"acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.0253795249107784, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.0253795249107784 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919446, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919446 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6153846153846154, "acc_stderr": 0.024666744915187208, "acc_norm": 0.6153846153846154, "acc_norm_stderr": 0.024666744915187208 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.02813325257881563, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.02813325257881563 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976044, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976044 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8220858895705522, "acc_stderr": 0.03004735765580663, "acc_norm": 0.8220858895705522, "acc_norm_stderr": 0.03004735765580663 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459754, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459754 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973136, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973136 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30614525139664805, "acc_stderr": 0.015414494487903217, "acc_norm": 0.30614525139664805, "acc_norm_stderr": 0.015414494487903217 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292452, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292452 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881877, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.02592237178881877 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.02474862449053737, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.02474862449053737 
}, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083138, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083138 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687492, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687492 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274645, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274645 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.0167765996767294, "mc2": 0.521661560648742, "mc2_stderr": 0.015256495321750132 }, "harness|winogrande|5": { "acc": 0.7726913970007893, "acc_stderr": 0.011778612167091087 }, "harness|gsm8k|5": { "acc": 0.6080363912054587, "acc_stderr": 0.013447140886023817 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
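Besides the generic snippet shown in the card above, the metadata that follows defines one config per task, so individual subtasks and the aggregated results can be loaded on their own. A short usage sketch (config and split names are taken from this record's metadata; the aggregated numbers are assumed to live in the "results" config, as the card states):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties"

# Load a single MMLU subtask; per-task configs follow harness_hendrycksTest_<subject>_5.
abstract_algebra = load_dataset(
    REPO,
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",  # the "latest" split points at the most recent evaluation run
)

# Load the aggregated per-benchmark metrics from the "results" config.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```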
open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties
[ "region:us" ]
2024-02-02T02:28:26+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_18-7B-dare_ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_18-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_18-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:26:03.631353](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties/blob/main/results_2024-02-02T02-26-03.631353.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.638141011794225,\n \"acc_stderr\": 0.03221763395931037,\n \"acc_norm\": 0.640122595431434,\n \"acc_norm_stderr\": 0.032859222090384846,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.521661560648742,\n \"mc2_stderr\": 0.015256495321750132\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6521609241187014,\n \"acc_stderr\": 0.004753112432728698,\n \"acc_norm\": 0.8436566421031667,\n \"acc_norm_stderr\": 0.003624383120823463\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919446,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919446\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.015414494487903217,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.015414494487903217\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.521661560648742,\n \"mc2_stderr\": 0.015256495321750132\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6080363912054587,\n \"acc_stderr\": 0.013447140886023817\n }\n}\n```", "repo_url": 
"https://huggingface.co/Gille/StrangeMerges_18-7B-dare_ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-26-03.631353.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-26-03.631353.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-26-03.631353.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-26-03.631353.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-26-03.631353.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_26_03.631353", "path": ["**/details_harness|winogrande|5_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-26-03.631353.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_26_03.631353", "path": ["results_2024-02-02T02-26-03.631353.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-26-03.631353.parquet"]}]}]}
2024-02-02T02:28:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_18-7B-dare_ties Dataset automatically created during the evaluation run of model Gille/StrangeMerges_18-7B-dare_ties on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:26:03.631353 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
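The loading snippet referenced earlier in this card ("To load the details from a run, you can for instance do the following:") was dropped when the card text was flattened. A minimal reconstruction, following the same pattern as the other cards in this dump, is shown below; the repository id is assumed from the usual `details_<org>__<model>` naming convention rather than stated in this record.

```python
from datasets import load_dataset

# Load one per-task configuration of the details dataset.
# The repository id is assumed from the "details_<org>__<model>" pattern used by sibling records.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_18-7B-dare_ties",
    "harness_winogrande_5",   # any config name listed in the metadata above works here
    split="train",            # "train" always points to the latest results
)
```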
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_18-7B-dare_ties\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_18-7B-dare_ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:26:03.631353(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_18-7B-dare_ties\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_18-7B-dare_ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:26:03.631353(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9b18f4cd07d81ec0bceeab0356e547c9d1b923a7
# Dataset Card for Evaluation run of Gille/StrangeMerges_4-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_4-7B-slerp](https://huggingface.co/Gille/StrangeMerges_4-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:32:53.668872](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp/blob/main/results_2024-02-02T02-32-53.668872.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6570402657136609, "acc_stderr": 0.03181426103850339, "acc_norm": 0.6576418831798367, "acc_norm_stderr": 0.03246662892084514, "mc1": 0.45532435740514077, "mc1_stderr": 0.017433490102538772, "mc2": 0.6240238096985373, "mc2_stderr": 0.0150858230636782 }, "harness|arc:challenge|25": { "acc": 0.6501706484641638, "acc_stderr": 0.013936809212158285, "acc_norm": 0.6945392491467577, "acc_norm_stderr": 0.01346008047800251 }, "harness|hellaswag|10": { "acc": 0.6774546903007369, "acc_stderr": 0.004664950168300713, "acc_norm": 0.8701453893646683, "acc_norm_stderr": 0.003354564257491871 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754406, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754406 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083522, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083522 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511657, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511657 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.02361088430892786, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.02361088430892786 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131147, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131147 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297794, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297794 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, 
"acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.02574490253229092, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.02574490253229092 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.013306478243066302, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.013306478243066302 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4, "acc_stderr": 0.01638463841038082, "acc_norm": 0.4, "acc_norm_stderr": 0.01638463841038082 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7592592592592593, "acc_stderr": 0.023788583551658533, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.023788583551658533 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214961, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214961 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406755, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406755 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306046, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306046 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.45532435740514077, "mc1_stderr": 0.017433490102538772, "mc2": 0.6240238096985373, "mc2_stderr": 0.0150858230636782 }, "harness|winogrande|5": { "acc": 0.829518547750592, "acc_stderr": 0.010569021122825909 }, "harness|gsm8k|5": { "acc": 0.686125852918878, "acc_stderr": 0.012782681251053201 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
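As a possible complement to the loading example earlier in this card, the aggregated scores can also be read from the "results" configuration mentioned above. The sketch below is illustrative: the config name comes from the card text, and the "latest" split name is assumed to follow the same pattern as the other records in this dump.

```python
from datasets import load_dataset

# Load the aggregated results (one row per evaluation run) instead of a per-task config.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp",
    "results",       # aggregated metrics, as opposed to the per-task harness_* configs
    split="latest",  # assumed to point to the most recent run, as in the sibling records
)

# Inspect the metrics stored for the most recent run.
print(results[0])
```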
open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp
[ "region:us" ]
2024-02-02T02:35:13+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_4-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_4-7B-slerp](https://huggingface.co/Gille/StrangeMerges_4-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:32:53.668872](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp/blob/main/results_2024-02-02T02-32-53.668872.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6570402657136609,\n \"acc_stderr\": 0.03181426103850339,\n \"acc_norm\": 0.6576418831798367,\n \"acc_norm_stderr\": 0.03246662892084514,\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6240238096985373,\n \"mc2_stderr\": 0.0150858230636782\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158285,\n \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.01346008047800251\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6774546903007369,\n \"acc_stderr\": 0.004664950168300713,\n \"acc_norm\": 0.8701453893646683,\n \"acc_norm_stderr\": 0.003354564257491871\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6820512820512821,\n \"acc_stderr\": 0.02361088430892786,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.02361088430892786\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n 
\"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6240238096985373,\n \"mc2_stderr\": 0.0150858230636782\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \"acc_stderr\": 0.012782681251053201\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_4-7B-slerp", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-32-53.668872.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-32-53.668872.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-32-53.668872.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-32-53.668872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-32-53.668872.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-32-53.668872.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["**/details_harness|winogrande|5_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-32-53.668872.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T02_32_53.668872", "path": ["results_2024-02-02T02-32-53.668872.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T02-32-53.668872.parquet"]}]}]}
2024-02-02T02:35:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_4-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_4-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:32:53.668872 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
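The loading snippet that the card text above refers to ("To load the details from a run, you can for instance do the following:") was stripped during text processing; a minimal sketch of what it would look like is given below. It follows the `load_dataset` pattern used by the other evaluation-run cards in this collection: the details repository name is an assumption inferred from that naming pattern, while the config name `harness_winogrande_5` is taken from the configs listed in this record's metadata.

```python
from datasets import load_dataset

# Minimal sketch: load one configuration of this evaluation-run dataset.
# The repository name is inferred from the naming pattern used by the other
# Open LLM Leaderboard detail datasets in this collection and may differ.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_4-7B-slerp",
    "harness_winogrande_5",  # any config_name listed in the metadata works
    split="train",           # per the card, "train" points to the latest results
)
print(data)
```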
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_4-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_4-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:32:53.668872(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_4-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_4-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:32:53.668872(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
03ebce14b0028bc45b6c34ffbe681335808b3c5c
# Intensified PHOENIX 14-T German Sign Language Dataset <!-- Provide a quick summary of the dataset. --> This is a German-to-German Sign Language (DGS) dataset of weather forecasts. It is a prosodically-enhanced version of the [RWTH-PHOENIX-Weather-2014T](https://www-i6.informatik.rwth-aachen.de/~koller/RWTH-PHOENIX-2014-T/) dataset. ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [Mert Inan] - **Language(s) (NLP):** German, DGS (German Sign Language) ### Dataset Sources [optional] - **Repository:** [Modeling Intensification for Sign Language Generation](https://github.com/Merterm/Modeling-Intensification-for-SLG/tree/main) - **Paper:** [Modeling Intensification for Sign Language Generation: A Computational Approach @ ACL 2022](https://aclanthology.org/2022.findings-acl.228/) - **Demo:** [Video Explanation & Demo](https://aclanthology.org/2022.findings-acl.228.mp4) ## Uses <!-- Address questions around how the dataset is intended to be used. --> The dataset is used for sign language generation in the original paper. The data contains parallel samples between German, German Sign Language (DGS) glosses, and German Sign Language (DGS) skeletal coordinates in the OpenPose format without the face. ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). 
If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ~~~ @inproceedings{inan-etal-2022-modeling, title = "Modeling Intensification for Sign Language Generation: A Computational Approach", author = "Inan, Mert and Zhong, Yang and Hassan, Sabit and Quandt, Lorna and Alikhani, Malihe", editor = "Muresan, Smaranda and Nakov, Preslav and Villavicencio, Aline", booktitle = "Findings of the Association for Computational Linguistics: ACL 2022", month = may, year = "2022", address = "Dublin, Ireland", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2022.findings-acl.228", doi = "10.18653/v1/2022.findings-acl.228", pages = "2897--2911", abstract = "End-to-end sign language generation models do not accurately represent the prosody in sign language. A lack of temporal and spatial variations leads to poor-quality generated presentations that confuse human interpreters. In this paper, we aim to improve the prosody in generated sign languages by modeling intensification in a data-driven manner. We present different strategies grounded in linguistics of sign language that inform how intensity modifiers can be represented in gloss annotations. To employ our strategies, we first annotate a subset of the benchmark PHOENIX-14T, a German Sign Language dataset, with different levels of intensification. We then use a supervised intensity tagger to extend the annotated dataset and obtain labels for the remaining portion of it. This enhanced dataset is then used to train state-of-the-art transformer models for sign language generation. We find that our efforts in intensification modeling yield better results when evaluated with automatic metrics. Human evaluation also indicates a higher preference of the videos generated using our model.", } ~~~ **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
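For readers who want to inspect the parallel German / DGS-gloss / skeletal-pose samples described in the Uses section above, a minimal sketch with the `datasets` library is shown below. The repository id is the dataset id recorded for this card; the split and the exact column names are assumptions and should be checked against the dataset's actual features.

```python
from datasets import load_dataset

# Minimal sketch: load the Intensified PHOENIX 14-T dataset and inspect its schema.
# The split name is an assumption; the column names (German text, DGS glosses,
# OpenPose skeletal coordinates) should be read from ds.features rather than assumed.
ds = load_dataset("merterm/intensified-phoenix-14-t", split="train")
print(ds)           # number of rows and column names
print(ds.features)  # schema of the parallel German / gloss / pose fields
```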
merterm/intensified-phoenix-14-t
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:de", "license:mit", "region:us" ]
2024-02-02T02:40:18+00:00
{"language": ["de"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]}
2024-02-02T03:19:50+00:00
[]
[ "de" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-German #license-mit #region-us
# Intensified PHOENIX 14-T German Sign Language Dataset This is a German-to-German Sign Language (DGS) dataset of weather forecasts. It is a prosodically-enhanced version of the RWTH-PHOENIX-Weather-2014T dataset. ## Dataset Details ### Dataset Description - Curated by: [Mert Inan] - Language(s) (NLP): German, DGS (German Sign Language) ### Dataset Sources [optional] - Repository: Modeling Intensification for Sign Language Generation - Paper: Modeling Intensification for Sign Language Generation: A Computational Approach @ ACL 2022 - Demo: Video Explanation & Demo ## Uses The dataset is used for sign language generation in the original paper. The data contains parallel samples between German, German Sign Language (DGS) glosses, and German Sign Language (DGS) skeletal coordinates in the OpenPose format without the face. ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: ~~~ @inproceedings{inan-etal-2022-modeling, title = "Modeling Intensification for Sign Language Generation: A Computational Approach", author = "Inan, Mert and Zhong, Yang and Hassan, Sabit and Quandt, Lorna and Alikhani, Malihe", editor = "Muresan, Smaranda and Nakov, Preslav and Villavicencio, Aline", booktitle = "Findings of the Association for Computational Linguistics: ACL 2022", month = may, year = "2022", address = "Dublin, Ireland", publisher = "Association for Computational Linguistics", url = "URL doi = "10.18653/v1/2022.findings-acl.228", pages = "2897--2911", abstract = "End-to-end sign language generation models do not accurately represent the prosody in sign language. A lack of temporal and spatial variations leads to poor-quality generated presentations that confuse human interpreters. In this paper, we aim to improve the prosody in generated sign languages by modeling intensification in a data-driven manner. We present different strategies grounded in linguistics of sign language that inform how intensity modifiers can be represented in gloss annotations. To employ our strategies, we first annotate a subset of the benchmark PHOENIX-14T, a German Sign Language dataset, with different levels of intensification. We then use a supervised intensity tagger to extend the annotated dataset and obtain labels for the remaining portion of it. This enhanced dataset is then used to train state-of-the-art transformer models for sign language generation. We find that our efforts in intensification modeling yield better results when evaluated with automatic metrics. Human evaluation also indicates a higher preference of the videos generated using our model.", } ~~~ APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Intensified PHOENIX 14-T German Sign Language Dataset\n\n\n\nThis is a German-to-German Sign Language (DGS) dataset of weather forecasts. It is a prosodically-enhanced version of the RWTH-PHOENIX-Weather-2014T dataset.", "## Dataset Details", "### Dataset Description\n\n\n\n- Curated by: [Mert Inan]\n- Language(s) (NLP): German, DGS (German Sign Language)", "### Dataset Sources [optional]\n\n- Repository: Modeling Intensification for Sign Language Generation\n- Paper: Modeling Intensification for Sign Language Generation: A Computational Approach @ ACL 2022\n- Demo: Video Explanation & Demo", "## Uses\n\n\n\nThe dataset is used for sign language generation in the original paper. The data contains parallel samples between German, German Sign Language (DGS) glosses, and German Sign Language (DGS) skeletal coordinates in the OpenPose format without the face.", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n~~~\n@inproceedings{inan-etal-2022-modeling,\n title = \"Modeling Intensification for Sign Language Generation: A Computational Approach\",\n author = \"Inan, Mert and\n Zhong, Yang and\n Hassan, Sabit and\n Quandt, Lorna and\n Alikhani, Malihe\",\n editor = \"Muresan, Smaranda and\n Nakov, Preslav and\n Villavicencio, Aline\",\n booktitle = \"Findings of the Association for Computational Linguistics: ACL 2022\",\n month = may,\n year = \"2022\",\n address = \"Dublin, Ireland\",\n publisher = \"Association for Computational Linguistics\",\n url = \"URL\n doi = \"10.18653/v1/2022.findings-acl.228\",\n pages = \"2897--2911\",\n abstract = \"End-to-end sign language generation models do not accurately represent the prosody in sign language. A lack of temporal and spatial variations leads to poor-quality generated presentations that confuse human interpreters. In this paper, we aim to improve the prosody in generated sign languages by modeling intensification in a data-driven manner. We present different strategies grounded in linguistics of sign language that inform how intensity modifiers can be represented in gloss annotations. To employ our strategies, we first annotate a subset of the benchmark PHOENIX-14T, a German Sign Language dataset, with different levels of intensification. We then use a supervised intensity tagger to extend the annotated dataset and obtain labels for the remaining portion of it. This enhanced dataset is then used to train state-of-the-art transformer models for sign language generation. We find that our efforts in intensification modeling yield better results when evaluated with automatic metrics. Human evaluation also indicates a higher preference of the videos generated using our model.\",\n}\n~~~\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-German #license-mit #region-us \n", "# Intensified PHOENIX 14-T German Sign Language Dataset\n\n\n\nThis is a German-to-German Sign Language (DGS) dataset of weather forecasts. It is a prosodically-enhanced version of the RWTH-PHOENIX-Weather-2014T dataset.", "## Dataset Details", "### Dataset Description\n\n\n\n- Curated by: [Mert Inan]\n- Language(s) (NLP): German, DGS (German Sign Language)", "### Dataset Sources [optional]\n\n- Repository: Modeling Intensification for Sign Language Generation\n- Paper: Modeling Intensification for Sign Language Generation: A Computational Approach @ ACL 2022\n- Demo: Video Explanation & Demo", "## Uses\n\n\n\nThe dataset is used for sign language generation in the original paper. The data contains parallel samples between German, German Sign Language (DGS) glosses, and German Sign Language (DGS) skeletal coordinates in the OpenPose format without the face.", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n~~~\n@inproceedings{inan-etal-2022-modeling,\n title = \"Modeling Intensification for Sign Language Generation: A Computational Approach\",\n author = \"Inan, Mert and\n Zhong, Yang and\n Hassan, Sabit and\n Quandt, Lorna and\n Alikhani, Malihe\",\n editor = \"Muresan, Smaranda and\n Nakov, Preslav and\n Villavicencio, Aline\",\n booktitle = \"Findings of the Association for Computational Linguistics: ACL 2022\",\n month = may,\n year = \"2022\",\n address = \"Dublin, Ireland\",\n publisher = \"Association for Computational Linguistics\",\n url = \"URL\n doi = \"10.18653/v1/2022.findings-acl.228\",\n pages = \"2897--2911\",\n abstract = \"End-to-end sign language generation models do not accurately represent the prosody in sign language. A lack of temporal and spatial variations leads to poor-quality generated presentations that confuse human interpreters. In this paper, we aim to improve the prosody in generated sign languages by modeling intensification in a data-driven manner. We present different strategies grounded in linguistics of sign language that inform how intensity modifiers can be represented in gloss annotations. To employ our strategies, we first annotate a subset of the benchmark PHOENIX-14T, a German Sign Language dataset, with different levels of intensification. We then use a supervised intensity tagger to extend the annotated dataset and obtain labels for the remaining portion of it. This enhanced dataset is then used to train state-of-the-art transformer models for sign language generation. We find that our efforts in intensification modeling yield better results when evaluated with automatic metrics. Human evaluation also indicates a higher preference of the videos generated using our model.\",\n}\n~~~\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5f7e7519ab49823cd0fd33a2eb75205775fc31e3
# Dataset Card for Evaluation run of Gille/StrangeMerges_2-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_2-7B-slerp](https://huggingface.co/Gille/StrangeMerges_2-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:38:12.814643](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp/blob/main/results_2024-02-02T02-38-12.814643.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6542232547056861, "acc_stderr": 0.03183153178270036, "acc_norm": 0.6559270430918996, "acc_norm_stderr": 0.03247560880535698, "mc1": 0.38310893512851896, "mc1_stderr": 0.017018461679389855, "mc2": 0.5453492264058105, "mc2_stderr": 0.015070179185167773 }, "harness|arc:challenge|25": { "acc": 0.6313993174061433, "acc_stderr": 0.014097810678042192, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.01375206241981783 }, "harness|hellaswag|10": { "acc": 0.6618203545110536, "acc_stderr": 0.004721231637092722, "acc_norm": 0.8552081258713403, "acc_norm_stderr": 0.0035117170854519846 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.042320736951515885, "acc_norm": 0.6, "acc_norm_stderr": 0.042320736951515885 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268542, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268542 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6846153846153846, "acc_stderr": 0.023559646983189946, "acc_norm": 0.6846153846153846, "acc_norm_stderr": 0.023559646983189946 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857403, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857403 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.0302839955258844, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 
0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092448, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092448 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538271, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474086, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474086 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.02616056824660146, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.02616056824660146 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037182, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037182 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.039166677628225836, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.039166677628225836 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3843575418994413, "acc_stderr": 0.016269088663959402, "acc_norm": 0.3843575418994413, "acc_norm_stderr": 0.016269088663959402 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.0248480182638752, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7623456790123457, "acc_stderr": 0.02368359183700856, "acc_norm": 0.7623456790123457, "acc_norm_stderr": 0.02368359183700856 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.01273239828619044, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.01273239828619044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.018850084696468712, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.018850084696468712 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.38310893512851896, "mc1_stderr": 0.017018461679389855, "mc2": 0.5453492264058105, "mc2_stderr": 0.015070179185167773 }, "harness|winogrande|5": { "acc": 0.823993685872139, "acc_stderr": 0.010703090882320705 }, "harness|gsm8k|5": { "acc": 0.6148597422289613, "acc_stderr": 0.013404165536474303 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
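The loading example near the top of this card targets a single task configuration. The aggregated metrics live in the separate "results" configuration mentioned in the summary; a minimal sketch of reading them is below (it assumes the "latest" split listed in this repository's metadata, and only inspects the rows rather than assuming a particular column layout):

```python
from datasets import load_dataset

# Minimal sketch: pull the aggregated "results" configuration of this run.
# The "latest" split is an alias for the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp",
    "results",
    split="latest",
)

# The exact row schema is not documented in this card, so inspect it
# before relying on specific column names.
print(results.column_names)
print(results[0])
```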
open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp
[ "region:us" ]
2024-02-02T02:40:32+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_2-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_2-7B-slerp](https://huggingface.co/Gille/StrangeMerges_2-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:38:12.814643](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp/blob/main/results_2024-02-02T02-38-12.814643.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6542232547056861,\n \"acc_stderr\": 0.03183153178270036,\n \"acc_norm\": 0.6559270430918996,\n \"acc_norm_stderr\": 0.03247560880535698,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5453492264058105,\n \"mc2_stderr\": 0.015070179185167773\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042192,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.01375206241981783\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6618203545110536,\n \"acc_stderr\": 0.004721231637092722,\n \"acc_norm\": 0.8552081258713403,\n \"acc_norm_stderr\": 0.0035117170854519846\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n 
\"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5453492264058105,\n \"mc2_stderr\": 0.015070179185167773\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6148597422289613,\n \"acc_stderr\": 0.013404165536474303\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_2-7B-slerp", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-38-12.814643.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-38-12.814643.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-38-12.814643.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-38-12.814643.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-38-12.814643.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-38-12.814643.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["**/details_harness|winogrande|5_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-38-12.814643.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T02_38_12.814643", "path": ["results_2024-02-02T02-38-12.814643.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T02-38-12.814643.parquet"]}]}]}
2024-02-02T02:40:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_2-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_2-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:38:12.814643 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
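The flattened card above ends at "To load the details from a run, you can for instance do the following:" without the accompanying snippet, so here is a minimal sketch of what such a call could look like with the Hugging Face `datasets` library. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming and is not stated in this record; the configuration and split names are taken from the metadata listed above.

```python
# Minimal sketch only: the repository id below is assumed from the leaderboard's
# usual "details_<org>__<model>" naming convention and should be verified.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_2-7B-slerp",  # assumed id
    "harness_winogrande_5",  # one of the 63 configurations in the metadata above
    split="latest",          # or the timestamped split "2024_02_02T02_38_12.814643"
)
print(data)
```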
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_2-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_2-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:38:12.814643(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_2-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_2-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:38:12.814643(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b30fc2b038cb7581a59daa7a84bf8bbb0bf40558
Table of Contents Dataset Description Dataset Summary Supported Tasks Languages Dataset Structure Data Instances Data Fields Data Splits Dataset Creation Curation Rationale Source Data Annotations Personal and Sensitive Information Considerations for Using the Data Social Impact of Dataset Discussion of Biases Other Known Limitations Additional Information Dataset Curators Licensing Information Citation Information Dataset Description Dataset Summary This dataset comprises comments from IMDb on "Game of Thrones" episodes, including ratings (POINT), dates of the comments (DATE), titles of the episodes (TITLE), usernames (USER_NAME), and the content of the comments (CONTENT). Supported Tasks Sentiment Analysis: Determine the sentiment of comments. Text Classification: Classify comments by sentiment or episode. Language Modeling: Train models on entertainment-specific text. Languages The dataset is primarily in English. Dataset Structure Data Instances A data instance might look like this: { "POINT": 8, "DATE": "2019-04-14", "TITLE": "Winterfell", "USER_NAME": "john_doe", "CONTENT": "Great episode but expected more from the storyline." } Data Fields POINT: Rating given by the user. DATE: Date when the comment was posted. TITLE: Title of the episode being commented on. USER_NAME: Username of the commenter. CONTENT: Text of the comment. Data Splits The dataset documentation should detail the division into training, validation, and test sets, if applicable. Dataset Creation Curation Rationale Curated to analyze viewer reactions towards "Game of Thrones" episodes, aiming to provide insights into the series' reception and engagement levels. Source Data Comments were collected from IMDb's episode pages for "Game of Thrones". Annotations The dataset does not include additional annotations beyond the user-provided ratings and comments. Personal and Sensitive Information Includes usernames that could be considered personal information. Users should handle this data responsibly. Considerations for Using the Data Social Impact of Dataset Facilitates understanding of viewer sentiments and can contribute to cultural impact studies on popular television series. Discussion of Biases May contain biases towards English-speaking and online-commenting populations. Other Known Limitations Sentiments expressed may not represent the broader audience's views accurately. Additional Information Dataset Curators Curated by Abdalrhman Alquaary in 2023. Licensing Information Specify the dataset's licensing here. Citation Information BibTeX: @misc{game_of_thrones_imdb_comments_2023, title={Game of Thrones Comments on IMDb}, author={Alquaary, Abdalrhman}, year={2023} }
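As a concrete illustration of the Supported Tasks and Data Fields sections above, the following is a minimal sketch of how the POINT rating could be bucketed into a coarse sentiment label for text classification. It assumes the dataset loads directly via `datasets.load_dataset` with a `train` split and the column names listed under Data Fields; neither detail is specified in the card itself.

```python
# Minimal sketch: split name and direct loadability are assumptions;
# column names follow the Data Fields section above.
from datasets import load_dataset

ds = load_dataset("ApoAlquaary/Game-of-Thrones-IMDB", split="train")  # split name assumed

def add_sentiment(example):
    # Bucket the 1-10 IMDb rating into a rough sentiment label.
    point = int(example["POINT"])
    if point >= 7:
        example["sentiment"] = "positive"
    elif point <= 4:
        example["sentiment"] = "negative"
    else:
        example["sentiment"] = "neutral"
    return example

ds = ds.map(add_sentiment)
print(ds[0]["TITLE"], ds[0]["sentiment"])
```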
ApoAlquaary/Game-of-Thrones-IMDB
[ "task_categories:text-classification", "task_categories:zero-shot-classification", "size_categories:10K<n<100K", "language:en", "region:us" ]
2024-02-02T02:45:26+00:00
{"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification", "zero-shot-classification"]}
2024-02-02T02:50:01+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #task_categories-zero-shot-classification #size_categories-10K<n<100K #language-English #region-us
Table of Contents Dataset Description Dataset Summary Supported Tasks Languages Dataset Structure Data Instances Data Fields Data Splits Dataset Creation Curation Rationale Source Data Annotations Personal and Sensitive Information Considerations for Using the Data Social Impact of Dataset Discussion of Biases Other Known Limitations Additional Information Dataset Curators Licensing Information Citation Information Dataset Description Dataset Summary This dataset comprises comments from IMDb on "Game of Thrones" episodes, including ratings (POINT), dates of the comments (DATE), titles of the episodes (TITLE), usernames (USER_NAME), and the content of the comments (CONTENT). Supported Tasks Sentiment Analysis: Determine the sentiment of comments. Text Classification: Classify comments by sentiment or episode. Language Modeling: Train models on entertainment-specific text. Languages The dataset is primarily in English. Dataset Structure Data Instances A data instance might look like this: { "POINT": 8, "DATE": "2019-04-14", "TITLE": "Winterfell", "USER_NAME": "john_doe", "CONTENT": "Great episode but expected more from the storyline." } Data Fields POINT: Rating given by the user. DATE: Date when the comment was posted. TITLE: Title of the episode being commented on. USER_NAME: Username of the commenter. CONTENT: Text of the comment. Data Splits The dataset documentation should detail the division into training, validation, and test sets, if applicable. Dataset Creation Curation Rationale Curated to analyze viewer reactions towards "Game of Thrones" episodes, aiming to provide insights into the series' reception and engagement levels. Source Data Comments were collected from IMDb's episode pages for "Game of Thrones". Annotations The dataset does not include additional annotations beyond the user-provided ratings and comments. Personal and Sensitive Information Includes usernames that could be considered personal information. Users should handle this data responsibly. Considerations for Using the Data Social Impact of Dataset Facilitates understanding of viewer sentiments and can contribute to cultural impact studies on popular television series. Discussion of Biases May contain biases towards English-speaking and online-commenting populations. Other Known Limitations Sentiments expressed may not represent the broader audience's views accurately. Additional Information Dataset Curators Curated by Abdalrhman Alquaary in 2023. Licensing Information Specify the dataset's licensing here. Citation Information BibTeX: @misc{game_of_thrones_imdb_comments_2023, title={Game of Thrones Comments on IMDb}, author={Alquaary, Abdalrhman}, year={2023} }
[]
[ "TAGS\n#task_categories-text-classification #task_categories-zero-shot-classification #size_categories-10K<n<100K #language-English #region-us \n" ]
301225ac6c21744da0ac500e4843a4af25b6f0b7
# Dataset Card for Evaluation run of Gille/StrangeMerges_5-7B-ties <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_5-7B-ties](https://huggingface.co/Gille/StrangeMerges_5-7B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:44:27.733282](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties/blob/main/results_2024-02-02T02-44-27.733282.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6545139389997998, "acc_stderr": 0.032091694452076346, "acc_norm": 0.6541682196117141, "acc_norm_stderr": 0.03275968368339009, "mc1": 0.5165238678090576, "mc1_stderr": 0.01749394019005772, "mc2": 0.6637291950615067, "mc2_stderr": 0.015304299142803788 }, "harness|arc:challenge|25": { "acc": 0.689419795221843, "acc_stderr": 0.013522292098053059, "acc_norm": 0.7167235494880546, "acc_norm_stderr": 0.013167478735134575 }, "harness|hellaswag|10": { "acc": 0.7105158334993029, "acc_stderr": 0.0045259609655517044, "acc_norm": 0.8788090021907986, "acc_norm_stderr": 0.003256821418857317 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.035676037996391706, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.035676037996391706 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778408, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778408 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 
0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.0251956584289318, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.0251956584289318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601443, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601443 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.03957835471980979, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.03957835471980979 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.013306478243066298, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.013306478243066298 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4346368715083799, "acc_stderr": 0.01657899743549672, "acc_norm": 0.4346368715083799, "acc_norm_stderr": 0.01657899743549672 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.02447722285613511, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.02447722285613511 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083136, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083136 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.028661996202335303, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.028661996202335303 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069443, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069443 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5165238678090576, "mc1_stderr": 0.01749394019005772, "mc2": 0.6637291950615067, "mc2_stderr": 0.015304299142803788 }, "harness|winogrande|5": { "acc": 0.8366219415943172, "acc_stderr": 0.010390695970273766 }, "harness|gsm8k|5": { "acc": 0.6884003032600455, "acc_stderr": 0.012757375376754938 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
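In addition to the per-task example shown earlier in this card, the aggregated "results" configuration described above can presumably be loaded the same way. The sketch below assumes it exposes the same "latest" split naming as the other configurations in this repository.

```python
# Minimal sketch: loads the aggregated "results" configuration described above.
# The "latest" split name mirrors the other configurations and is assumed here.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties",
    "results",
    split="latest",
)
print(results[0])
```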
open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties
[ "region:us" ]
2024-02-02T02:46:45+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_5-7B-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_5-7B-ties](https://huggingface.co/Gille/StrangeMerges_5-7B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:44:27.733282](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties/blob/main/results_2024-02-02T02-44-27.733282.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545139389997998,\n \"acc_stderr\": 0.032091694452076346,\n \"acc_norm\": 0.6541682196117141,\n \"acc_norm_stderr\": 0.03275968368339009,\n \"mc1\": 0.5165238678090576,\n \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6637291950615067,\n \"mc2_stderr\": 0.015304299142803788\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.689419795221843,\n \"acc_stderr\": 0.013522292098053059,\n \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n \"acc_stderr\": 0.0045259609655517044,\n \"acc_norm\": 0.8788090021907986,\n \"acc_norm_stderr\": 0.003256821418857317\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n 
\"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066298,\n \"acc_norm\": 0.8339719029374202,\n 
\"acc_norm_stderr\": 0.013306478243066298\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.01657899743549672,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.01657899743549672\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5165238678090576,\n \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6637291950615067,\n \"mc2_stderr\": 0.015304299142803788\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273766\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \"acc_stderr\": 0.012757375376754938\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_5-7B-ties", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-44-27.733282.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-44-27.733282.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-44-27.733282.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-44-27.733282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-44-27.733282.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-44-27.733282.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["**/details_harness|winogrande|5_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-44-27.733282.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T02_44_27.733282", "path": ["results_2024-02-02T02-44-27.733282.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T02-44-27.733282.parquet"]}]}]}
2024-02-02T02:47:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_5-7B-ties Dataset automatically created during the evaluation run of model Gille/StrangeMerges_5-7B-ties on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:44:27.733282 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
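The card above says "To load the details from a run, you can for instance do the following:" but the accompanying snippet was stripped from this processed copy. Below is a minimal sketch of that load; the configuration and split names are taken from the metadata above, while the repository id is an assumption based on the details_<org>__<model> naming pattern used by the other records in this dump.

```python
from datasets import load_dataset

# Assumed repo id, following the details_<org>__<model> pattern used by the
# other evaluation datasets in this document; verify against the record's id field.
repo_id = "open-llm-leaderboard/details_Gille__StrangeMerges_5-7B-ties"

# Per-task details: one configuration per evaluated task, e.g. Winogrande.
winogrande = load_dataset(repo_id, "harness_winogrande_5", split="train")

# Aggregated metrics for the run live in the "results" configuration; the
# "latest" split resolves to the most recent run.
results = load_dataset(repo_id, "results", split="latest")
```

The timestamped split name (2024_02_02T02_44_27.733282 in the metadata above) can be used instead of "latest" to pin a specific run.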
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_5-7B-ties\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_5-7B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:44:27.733282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_5-7B-ties\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_5-7B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:44:27.733282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d444297515e11f606fc4a14ebbb05eb564b34975
# Dataset Card for Evaluation run of Gille/StrangeMerges_11-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_11-7B-slerp](https://huggingface.co/Gille/StrangeMerges_11-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:49:24.432141](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp/blob/main/results_2024-02-02T02-49-24.432141.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6559449653551005, "acc_stderr": 0.03195634937592395, "acc_norm": 0.6556138392106509, "acc_norm_stderr": 0.03261766462313909, "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.6981288800240135, "mc2_stderr": 0.014855600124591495 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068744, "acc_norm": 0.7252559726962458, "acc_norm_stderr": 0.013044617212771227 }, "harness|hellaswag|10": { "acc": 0.7052380003983271, "acc_stderr": 0.004550038968550622, "acc_norm": 0.8819956184027086, "acc_norm_stderr": 0.0032195397905004815 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700918, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.03246956919789958, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.03246956919789958 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305527, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305527 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542946, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542946 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009181, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009181 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.02882088466625326, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.02882088466625326 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.030684737115135356, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.030684737115135356 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, 
"acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.015776239256163244, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.015776239256163244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.026558372502661916, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.026558372502661916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.0134682016140663, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.0134682016140663 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42905027932960893, "acc_stderr": 0.016553287863116037, "acc_norm": 0.42905027932960893, "acc_norm_stderr": 0.016553287863116037 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137894, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137894 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47392438070404175, "acc_stderr": 0.012752858346533126, "acc_norm": 0.47392438070404175, "acc_norm_stderr": 0.012752858346533126 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.02737294220178816, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.02737294220178816 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169146, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169146 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598052, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.6981288800240135, "mc2_stderr": 0.014855600124591495 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918742 }, "harness|gsm8k|5": { "acc": 0.7088703563305534, "acc_stderr": 0.012513215297888465 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
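The load example earlier in this card targets the Winogrande configuration; the same call works for any of the 63 configurations listed in the metadata below. A short illustrative sketch (configuration and split names are taken from that metadata; the choice of tasks here is arbitrary):

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp"

# Each evaluated task is its own configuration, e.g. ARC-Challenge or GSM8K;
# "latest" always points at the most recent run (2024_02_02T02_49_24.432141).
arc = load_dataset(repo_id, "harness_arc_challenge_25", split="latest")
gsm8k = load_dataset(repo_id, "harness_gsm8k_5", split="latest")

# The aggregated scores quoted in "Latest results" above come from the
# "results" configuration.
results = load_dataset(repo_id, "results", split="latest")
```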
open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp
[ "region:us" ]
2024-02-02T02:51:45+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_11-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_11-7B-slerp](https://huggingface.co/Gille/StrangeMerges_11-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:49:24.432141](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_11-7B-slerp/blob/main/results_2024-02-02T02-49-24.432141.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559449653551005,\n \"acc_stderr\": 0.03195634937592395,\n \"acc_norm\": 0.6556138392106509,\n \"acc_norm_stderr\": 0.03261766462313909,\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6981288800240135,\n \"mc2_stderr\": 0.014855600124591495\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7052380003983271,\n \"acc_stderr\": 0.004550038968550622,\n \"acc_norm\": 0.8819956184027086,\n \"acc_norm_stderr\": 0.0032195397905004815\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163244,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6981288800240135,\n \"mc2_stderr\": 0.014855600124591495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918742\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \"acc_stderr\": 0.012513215297888465\n }\n}\n```", "repo_url": 
"https://huggingface.co/Gille/StrangeMerges_11-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-49-24.432141.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-49-24.432141.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-49-24.432141.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-49-24.432141.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-49-24.432141.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_49_24.432141", "path": ["**/details_harness|winogrande|5_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-49-24.432141.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_49_24.432141", "path": ["results_2024-02-02T02-49-24.432141.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-49-24.432141.parquet"]}]}]}
2024-02-02T02:52:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_11-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_11-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:49:24.432141 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_11-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_11-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:49:24.432141(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_11-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_11-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:49:24.432141(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7eec3db1174e724dd49974e4727af571799bdcc7
# Dataset Card for Evaluation run of 152334H/miqu-1-70b-sf <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_152334H__miqu-1-70b-sf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:50:47.877017](https://huggingface.co/datasets/open-llm-leaderboard/details_152334H__miqu-1-70b-sf/blob/main/results_2024-02-02T02-50-47.877017.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7535057558387624, "acc_stderr": 0.02844686425929854, "acc_norm": 0.7567310195499674, "acc_norm_stderr": 0.02899256949695357, "mc1": 0.5336597307221542, "mc1_stderr": 0.017463793867168103, "mc2": 0.693814109430027, "mc2_stderr": 0.014818261284964268 }, "harness|arc:challenge|25": { "acc": 0.6928327645051194, "acc_stderr": 0.013481034054980945, "acc_norm": 0.7303754266211604, "acc_norm_stderr": 0.012968040686869154 }, "harness|hellaswag|10": { "acc": 0.7101175064728141, "acc_stderr": 0.004527804016253783, "acc_norm": 0.8860784704242183, "acc_norm_stderr": 0.0031706661225176552 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7037037037037037, "acc_stderr": 0.03944624162501116, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.03944624162501116 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8289473684210527, "acc_stderr": 0.03064360707167709, "acc_norm": 0.8289473684210527, "acc_norm_stderr": 0.03064360707167709 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7886792452830189, "acc_stderr": 0.025125766484827845, "acc_norm": 0.7886792452830189, "acc_norm_stderr": 0.025125766484827845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8958333333333334, "acc_stderr": 0.02554523921025691, "acc_norm": 0.8958333333333334, "acc_norm_stderr": 0.02554523921025691 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.44, "acc_stderr": 0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7514450867052023, "acc_stderr": 0.03295304696818318, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.03295304696818318 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.49019607843137253, "acc_stderr": 0.04974229460422817, "acc_norm": 0.49019607843137253, "acc_norm_stderr": 0.04974229460422817 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7361702127659574, "acc_stderr": 0.028809989854102956, "acc_norm": 0.7361702127659574, "acc_norm_stderr": 0.028809989854102956 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6140350877192983, "acc_stderr": 0.04579639422070435, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.04579639422070435 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7310344827586207, "acc_stderr": 0.036951833116502325, "acc_norm": 0.7310344827586207, "acc_norm_stderr": 0.036951833116502325 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5423280423280423, "acc_stderr": 0.02565886886205832, "acc_norm": 0.5423280423280423, "acc_norm_stderr": 0.02565886886205832 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04444444444444449, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.864516129032258, "acc_stderr": 0.019469334586486933, "acc_norm": 0.864516129032258, "acc_norm_stderr": 0.019469334586486933 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.645320197044335, "acc_stderr": 0.03366124489051449, "acc_norm": 0.645320197044335, "acc_norm_stderr": 0.03366124489051449 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.028450388805284357, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.028450388805284357 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9090909090909091, "acc_stderr": 0.020482086775424218, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.020482086775424218 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9430051813471503, "acc_stderr": 0.016731085293607558, "acc_norm": 0.9430051813471503, "acc_norm_stderr": 0.016731085293607558 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7923076923076923, "acc_stderr": 0.020567539567246815, "acc_norm": 0.7923076923076923, "acc_norm_stderr": 0.020567539567246815 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4185185185185185, "acc_stderr": 0.030078013075022062, "acc_norm": 0.4185185185185185, "acc_norm_stderr": 0.030078013075022062 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.865546218487395, "acc_stderr": 0.022159373072744442, "acc_norm": 0.865546218487395, "acc_norm_stderr": 0.022159373072744442 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5099337748344371, "acc_stderr": 
0.04081677107248436, "acc_norm": 0.5099337748344371, "acc_norm_stderr": 0.04081677107248436 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9174311926605505, "acc_stderr": 0.011800361363016569, "acc_norm": 0.9174311926605505, "acc_norm_stderr": 0.011800361363016569 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.7083333333333334, "acc_stderr": 0.030998666304560517, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.030998666304560517 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9264705882352942, "acc_stderr": 0.018318855850089678, "acc_norm": 0.9264705882352942, "acc_norm_stderr": 0.018318855850089678 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9156118143459916, "acc_stderr": 0.018094247116473332, "acc_norm": 0.9156118143459916, "acc_norm_stderr": 0.018094247116473332 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8116591928251121, "acc_stderr": 0.02624113299640726, "acc_norm": 0.8116591928251121, "acc_norm_stderr": 0.02624113299640726 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8549618320610687, "acc_stderr": 0.03088466108951538, "acc_norm": 0.8549618320610687, "acc_norm_stderr": 0.03088466108951538 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9338842975206612, "acc_stderr": 0.022683403691723305, "acc_norm": 0.9338842975206612, "acc_norm_stderr": 0.022683403691723305 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8611111111111112, "acc_stderr": 0.03343270062869621, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.03343270062869621 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8282208588957055, "acc_stderr": 0.02963471727237103, "acc_norm": 0.8282208588957055, "acc_norm_stderr": 0.02963471727237103 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6339285714285714, "acc_stderr": 0.0457237235873743, "acc_norm": 0.6339285714285714, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.0339329572976101, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.0339329572976101 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9316239316239316, "acc_stderr": 0.016534627684311364, "acc_norm": 0.9316239316239316, "acc_norm_stderr": 0.016534627684311364 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8939974457215837, "acc_stderr": 0.011008367705789363, "acc_norm": 0.8939974457215837, "acc_norm_stderr": 0.011008367705789363 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8352601156069365, "acc_stderr": 0.019971040982442272, "acc_norm": 0.8352601156069365, "acc_norm_stderr": 0.019971040982442272 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6581005586592179, "acc_stderr": 0.015864506461604654, "acc_norm": 0.6581005586592179, "acc_norm_stderr": 0.015864506461604654 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.826797385620915, "acc_stderr": 0.021668400256514307, "acc_norm": 0.826797385620915, "acc_norm_stderr": 0.021668400256514307 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8231511254019293, "acc_stderr": 0.0216700588855108, "acc_norm": 0.8231511254019293, "acc_norm_stderr": 0.0216700588855108 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8518518518518519, "acc_stderr": 0.019766459563597256, "acc_norm": 0.8518518518518519, "acc_norm_stderr": 0.019766459563597256 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5957446808510638, "acc_stderr": 0.02927553215970472, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.02927553215970472 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5938722294654498, "acc_stderr": 0.01254315458841292, "acc_norm": 0.5938722294654498, "acc_norm_stderr": 0.01254315458841292 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8235294117647058, "acc_stderr": 0.023157468308559345, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.023157468308559345 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8333333333333334, "acc_stderr": 0.015076937921915376, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.015076937921915376 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8244897959183674, "acc_stderr": 0.024352800722970015, "acc_norm": 0.8244897959183674, "acc_norm_stderr": 0.024352800722970015 }, "harness|hendrycksTest-sociology|5": { "acc": 0.9203980099502488, "acc_stderr": 0.01913968563350382, "acc_norm": 0.9203980099502488, "acc_norm_stderr": 0.01913968563350382 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.95, "acc_stderr": 0.021904291355759057, "acc_norm": 0.95, "acc_norm_stderr": 0.021904291355759057 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8888888888888888, "acc_stderr": 0.024103384202072864, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.024103384202072864 }, "harness|truthfulqa:mc|0": { "mc1": 0.5336597307221542, "mc1_stderr": 0.017463793867168103, "mc2": 0.693814109430027, "mc2_stderr": 0.014818261284964268 }, "harness|winogrande|5": { "acc": 0.8531965272296764, "acc_stderr": 0.009946627440250697 }, "harness|gsm8k|5": { "acc": 0.6770280515542078, "acc_stderr": 0.01288036079485182 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
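As a small hedged sketch (not part of the original card), the aggregated "results" configuration described above can be loaded just like the per-task details and inspected with pandas. This assumes the run follows the same config/split layout shown in the previous card's metadata, i.e. a "results" config that also exposes a "latest" split.

```python
from datasets import load_dataset

# Load the aggregated "results" config for this evaluation run; the "latest"
# split is assumed to point at the most recent results file, mirroring the
# config layout shown in the previous card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_152334H__miqu-1-70b-sf",
    "results",
    split="latest",
)

# One row per run; convert to pandas to browse the aggregated metrics.
print(results.to_pandas().head())
```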
open-llm-leaderboard/details_152334H__miqu-1-70b-sf
[ "region:us" ]
2024-02-02T02:53:13+00:00
{"pretty_name": "Evaluation run of 152334H/miqu-1-70b-sf", "dataset_summary": "Dataset automatically created during the evaluation run of model [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_152334H__miqu-1-70b-sf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:50:47.877017](https://huggingface.co/datasets/open-llm-leaderboard/details_152334H__miqu-1-70b-sf/blob/main/results_2024-02-02T02-50-47.877017.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7535057558387624,\n \"acc_stderr\": 0.02844686425929854,\n \"acc_norm\": 0.7567310195499674,\n \"acc_norm_stderr\": 0.02899256949695357,\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.693814109430027,\n \"mc2_stderr\": 0.014818261284964268\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980945,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869154\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7101175064728141,\n \"acc_stderr\": 0.004527804016253783,\n \"acc_norm\": 0.8860784704242183,\n \"acc_norm_stderr\": 0.0031706661225176552\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.02554523921025691,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.02554523921025691\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n 
\"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5423280423280423,\n \"acc_stderr\": 0.02565886886205832,\n \"acc_norm\": 0.5423280423280423,\n \"acc_norm_stderr\": 0.02565886886205832\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n \"acc_stderr\": 0.019469334586486933,\n \"acc_norm\": 0.864516129032258,\n \"acc_norm_stderr\": 0.019469334586486933\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.7923076923076923,\n \"acc_stderr\": 0.020567539567246815,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246815\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4185185185185185,\n \"acc_stderr\": 0.030078013075022062,\n \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.030078013075022062\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.030998666304560517,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.030998666304560517\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723305,\n \"acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723305\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8939974457215837,\n \"acc_stderr\": 0.011008367705789363,\n \"acc_norm\": 
0.8939974457215837,\n \"acc_norm_stderr\": 0.011008367705789363\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6581005586592179,\n \"acc_stderr\": 0.015864506461604654,\n \"acc_norm\": 0.6581005586592179,\n \"acc_norm_stderr\": 0.015864506461604654\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514307,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514307\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597256,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597256\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.02927553215970472,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.02927553215970472\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5938722294654498,\n \"acc_stderr\": 0.01254315458841292,\n \"acc_norm\": 0.5938722294654498,\n \"acc_norm_stderr\": 0.01254315458841292\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915376,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915376\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9203980099502488,\n \"acc_stderr\": 0.01913968563350382,\n \"acc_norm\": 0.9203980099502488,\n \"acc_norm_stderr\": 0.01913968563350382\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.693814109430027,\n \"mc2_stderr\": 0.014818261284964268\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250697\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \"acc_stderr\": 0.01288036079485182\n }\n}\n```", "repo_url": 
"https://huggingface.co/152334H/miqu-1-70b-sf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_50_47.877017", "path": ["**/details_harness|winogrande|5_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-50-47.877017.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_50_47.877017", "path": ["results_2024-02-02T02-50-47.877017.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-50-47.877017.parquet"]}]}]}
2024-02-02T02:53:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 152334H/miqu-1-70b-sf Dataset automatically created during the evaluation run of model 152334H/miqu-1-70b-sf on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:50:47.877017(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of 152334H/miqu-1-70b-sf\n\n\n\nDataset automatically created during the evaluation run of model 152334H/miqu-1-70b-sf on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:50:47.877017(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 152334H/miqu-1-70b-sf\n\n\n\nDataset automatically created during the evaluation run of model 152334H/miqu-1-70b-sf on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:50:47.877017(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7682d1ca4d44ddc88118bd02032bf78f13d7e177
This work was performed to help models with reasoning. I developed it while working on my Cinder model, a STEM Q&A model.

# Modified OpenORCA Step-by-Step Reasoning Dataset

## Overview

The Modified OpenORCA Step-by-Step Reasoning Dataset represents a groundbreaking resource in the field of artificial intelligence, specifically designed to enhance the reasoning capabilities of AI models. This unique dataset is the result of a meticulous process of sorting, selecting, and altering dialogues from the original OpenORCA collection, with a focus on promoting an intrinsic approach to step-by-step logical reasoning across a wide array of topics.

## Dataset Composition

Derived from the comprehensive OpenORCA dataset, this manually modified version strategically removes the sections of prompts that ask for step-by-step reasoning. Instead, it presents AI models with real-world scenarios requiring the deduction of logical steps to reach conclusions without explicit prompting to do so, thereby encouraging models to develop a natural inclination towards systematic problem-solving. The dataset spans various domains, including but not limited to everyday logical puzzles, basic mathematical problems, and complex scenario-based queries.

## Features

- Size: 92.4 MB, 64,963 rows of dialogues that demonstrate step-by-step reasoning.
- Format: Available in JSON, facilitating easy integration with common machine learning frameworks and environments.
- Content: Each entry includes a user query followed by a system-generated response that embodies step-by-step reasoning, without explicitly stating the requirement for such a process. This setup aims to train AI models to autonomously employ logical progression in their responses.
- Use Cases: Ideal for developing AI models geared towards natural language understanding, conversational AI, educational bots, and any application requiring a deep grasp of logical progression and problem-solving skills.

## Potential Applications

- AI Model Training: Serves as an invaluable tool for training and refining AI models, especially those focused on natural language processing, conversational intelligence, and automated reasoning.
- Educational Technology: Offers a rich resource for creating educational bots and tools designed to assist in teaching logical reasoning, critical thinking, and problem-solving strategies.
- Research and Development: Provides a robust foundation for academic and commercial research into improving the step-by-step reasoning capabilities of AI systems, enhancing their ability to understand and interact with the world in a more human-like manner.

## Licensing and Accessibility

This dataset is distributed under the MIT License, allowing for broad use, modification, and distribution, provided that the original license and copyright notices are included. This liberal licensing ensures that the Modified OpenORCA Step-by-Step Reasoning Dataset can be freely utilized by researchers, developers, and educators to advance the field of AI and develop applications that benefit from enhanced reasoning capabilities.

For requests, questions, support, or to chat about current research, message me on Cinder's Discord: https://discord.gg/5ebjDrnZ or email [email protected]

Original Open Orca dataset: https://huggingface.co/datasets/Open-Orca/OpenOrca

Inspired by the Microsoft unreleased datasets for Phi.
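Since the data is distributed as JSON on the Hub, a minimal loading sketch with the Hugging Face `datasets` library might look like the following. This snippet is not part of the original card: the repository id is taken from this record, and no particular split or column names are assumed, so it simply inspects whatever the loader returns.

```python
from datasets import load_dataset

# Minimal sketch (not from the original card): load the dataset from the Hub
# and inspect it. No specific split or column names are assumed here.
dataset = load_dataset("Josephgflowers/OpenOrca-Step-by-step-reasoning")

print(dataset)                        # available splits and row counts
split_name = list(dataset.keys())[0]  # typically "train" for a single data file
print(dataset[split_name][0])         # first row, as a dict of column -> value
```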
Special thanks to the contributors of the original dataset: Teknium, WingLian/Caseus, Eric Hartford, NanoBit, Pankaj, Winddude, Rohan, and http://AlignmentLab.ai: Autometa, Entropi, AtlasUnified, NeverendingToast, NanoBit, WingLian/Caseus.

Also special thanks to TheBloke for supporting the community.

Original Open Orca Citation:

@misc{OpenOrca,
  title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
  author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}

@misc{mukherjee2023orca,
  title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
  author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
  year={2023},
  eprint={2306.02707},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}

@misc{longpre2023flan,
  title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
  author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
  year={2023},
  eprint={2301.13688},
  archivePrefix={arXiv},
  primaryClass={cs.AI}
}

@misc{touvron2023llama,
  title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
  author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
  year={2023},
  eprint={2307.09288},
  archivePrefix={arXiv}
}

@software{touvron2023llama,
  title={LLaMA: Open and Efficient Foundation Language Models},
  author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
  journal={arXiv preprint arXiv:2302.13971},
  year={2023}
}
Josephgflowers/OpenOrca-Step-by-step-reasoning
[ "license:mit", "arxiv:2306.02707", "arxiv:2301.13688", "region:us" ]
2024-02-02T02:55:53+00:00
{"license": "mit"}
2024-02-02T16:26:12+00:00
[ "2306.02707", "2301.13688" ]
[]
TAGS #license-mit #arxiv-2306.02707 #arxiv-2301.13688 #region-us
This work was performed to help models with reasoning. I developed it working on my Cinder model, a STEM q and a model. Modified OpenORCA Step-by-Step Reasoning Dataset Overview The Modified OpenORCA Step-by-Step Reasoning Dataset represents a groundbreaking resource in the field of artificial intelligence, specifically designed to enhance the reasoning capabilities of AI models. This unique dataset is the result of a meticulous process of sorting, selecting, and altering dialogues from the original OpenORCA collection, with a focus on promoting an intrinsic approach to step-by-step logical reasoning across a wide array of topics. Dataset Composition Derived from the comprehensive OpenORCA dataset, this manually modified version strategically removes sections of prompts for step-by-step reasoning. Instead, it presents AI models with real-world scenarios requiring the deduction of logical steps to reach conclusions without explicit prompting to do so. Thereby encouraging models to develop a natural inclination towards systematic problem-solving. The dataset spans various domains, including but not limited to, everyday logical puzzles, basic mathematical problems, and complex scenario-based queries. Features Size: 92.4 MB, 64963 rows of dialogues that demonstrate step-by-step reasoning. Format: Available in JSON facilitating easy integration with common machine learning frameworks and environments. Content: Each entry includes a user query followed by a system-generated response that embodies step-by-step reasoning, without explicitly stating the requirement for such a process. This setup aims to train AI models to autonomously employ logical progression in their responses. Use Cases: Ideal for developing AI models geared towards natural language understanding, conversation AI, educational bots, and any application requiring a deep grasp of logical progression and problem-solving skills. Potential Applications AI Model Training: Serves as an invaluable tool for training and refining AI models, especially those focused on natural language processing, conversational intelligence, and automated reasoning. Educational Technology: Offers a rich resource for creating educational bots and tools designed to assist in teaching logical reasoning, critical thinking, and problem-solving strategies. Research and Development: Provides a robust foundation for academic and commercial research into improving step-by-step reasoning capabilities of AI systems, enhancing their ability to understand and interact with the world in a more human-like manner. Licensing and Accessibility This dataset is distributed under the MIT License, allowing for broad use, modification, and distribution, provided that the original license and copyright notices are included. This liberal licensing ensures that the Modified OpenORCA Step-by-Step Reasoning Dataset can be freely utilized by researchers, developers, and educators to advance the field of AI and develop applications that benefit from enhanced reasoning capabilities. For request, questions, support, or chat about current research, message me on Cinder's discord URL Or email Cinder-STEM@URL Original Open Orca dataset: URL Inspired by the Microsoft unreleased datasets for Phi. Special thanks to the contributors of the original dataset Teknium WingLian/Caseus Eric Hartford NanoBit Pankaj Winddude Rohan URL: Autometa Entropi AtlasUnified NeverendingToast NanoBit WingLian/Caseus Also special thanks to TheBloke for supporting the community. 
Original Open Orca Citation: @misc{OpenOrca, title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces}, author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"}, year = {2023}, publisher = {HuggingFace}, journal = {HuggingFace repository}, howpublished = {\url{https://URL}}, } @misc{mukherjee2023orca, title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4}, author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah}, year={2023}, eprint={2306.02707}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{longpre2023flan, title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning}, author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts}, year={2023}, eprint={2301.13688}, archivePrefix={arXiv}, primaryClass={cs.AI} } @misc{touvron2023llama, title={Llama 2: Open Foundation and Fine-Tuned Chat Models}, author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom}, year={2023}, eprint={2307.09288}, archivePrefix={arXiv} } @software{touvron2023llama, title={LLaMA: Open and Efficient Foundation Language Models}, author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\'e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume}, journal={arXiv preprint arXiv:2302.13971}, year={2023} }
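Since the card above describes the release as a JSON file of query/response dialogues, the short sketch below shows one way it could be loaded and inspected with the Hugging Face `datasets` library. The file name and the column names used here ("question", "response") are illustrative assumptions rather than the documented schema, so they should be checked against the actual release.

```python
# Minimal sketch, assuming a local JSON export of the dataset.
# The file name and column names below are hypothetical placeholders.
from datasets import load_dataset

dataset = load_dataset(
    "json",
    data_files="modified_openorca_step_by_step.json",  # hypothetical file name
    split="train",
)

print(dataset.num_rows)  # the card reports roughly 64,963 rows

example = dataset[0]     # each row pairs a user query with a step-by-step response
print(example.get("question", example))
print(example.get("response", ""))
```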
[]
[ "TAGS\n#license-mit #arxiv-2306.02707 #arxiv-2301.13688 #region-us \n" ]
cc6a516c3fafa21eb856aa9a767f1d1f0baf67da
# Dataset Card for Evaluation run of Gille/StrangeMerges_10-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_10-7B-slerp](https://huggingface.co/Gille/StrangeMerges_10-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T02:55:04.492502](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp/blob/main/results_2024-02-02T02-55-04.492502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6542458004549463, "acc_stderr": 0.03204861565652575, "acc_norm": 0.6539758320346176, "acc_norm_stderr": 0.03271443876560244, "mc1": 0.543451652386781, "mc1_stderr": 0.017437280953183688, "mc2": 0.6948877994288644, "mc2_stderr": 0.014809641585651314 }, "harness|arc:challenge|25": { "acc": 0.6919795221843004, "acc_stderr": 0.013491429517292038, "acc_norm": 0.7235494880546075, "acc_norm_stderr": 0.013069662474252423 }, "harness|hellaswag|10": { "acc": 0.7026488747261501, "acc_stderr": 0.004561582009834578, "acc_norm": 0.8829914359689305, "acc_norm_stderr": 0.0032077357692780455 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.674074074074074, "acc_stderr": 0.040491220417025055, "acc_norm": 0.674074074074074, "acc_norm_stderr": 0.040491220417025055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337135, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337135 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.04959859966384181, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909282, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.02525303255499769, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.02525303255499769 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.02289168798455496, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.02289168798455496 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8181818181818182, "acc_stderr": 0.0274796030105388, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.0274796030105388 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948482, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, 
"acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374307, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374307 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.026558372502661916, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.026558372502661916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42793296089385474, "acc_stderr": 0.01654788799741611, "acc_norm": 0.42793296089385474, "acc_norm_stderr": 0.01654788799741611 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47196870925684486, "acc_stderr": 0.012750151802922436, "acc_norm": 0.47196870925684486, "acc_norm_stderr": 0.012750151802922436 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.543451652386781, "mc1_stderr": 0.017437280953183688, "mc2": 0.6948877994288644, "mc2_stderr": 0.014809641585651314 }, "harness|winogrande|5": { "acc": 0.835043409629045, "acc_stderr": 0.01043091746823743 }, "harness|gsm8k|5": { "acc": 0.7012888551933283, "acc_stderr": 0.012607137125693639 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
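As a small illustration of how the per-task numbers shown in the results block above can be post-processed, the sketch below averages MMLU ("hendrycksTest") accuracies from a results dictionary of that shape. Only a truncated excerpt of the scores is reproduced here, so the printed average is illustrative rather than the leaderboard value.

```python
# Sketch: averaging per-task MMLU accuracies from a results dict shaped like
# the JSON shown in the card above. Only a few entries are reproduced here,
# so the computed average does not match the full leaderboard score.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.674074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

mmlu = {k: v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
average = sum(mmlu.values()) / len(mmlu)
print(f"Average accuracy over {len(mmlu)} MMLU subtasks: {average:.4f}")
```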
open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp
[ "region:us" ]
2024-02-02T02:57:22+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_10-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_10-7B-slerp](https://huggingface.co/Gille/StrangeMerges_10-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T02:55:04.492502](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp/blob/main/results_2024-02-02T02-55-04.492502.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6542458004549463,\n \"acc_stderr\": 0.03204861565652575,\n \"acc_norm\": 0.6539758320346176,\n \"acc_norm_stderr\": 0.03271443876560244,\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6948877994288644,\n \"mc2_stderr\": 0.014809641585651314\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252423\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7026488747261501,\n \"acc_stderr\": 0.004561582009834578,\n \"acc_norm\": 0.8829914359689305,\n \"acc_norm_stderr\": 0.0032077357692780455\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922436,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922436\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6948877994288644,\n \"mc2_stderr\": 0.014809641585651314\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823743\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \"acc_stderr\": 0.012607137125693639\n }\n}\n```", "repo_url": 
"https://huggingface.co/Gille/StrangeMerges_10-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T02_55_04.492502", "path": ["**/details_harness|winogrande|5_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T02-55-04.492502.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T02_55_04.492502", "path": ["results_2024-02-02T02-55-04.492502.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T02-55-04.492502.parquet"]}]}]}
2024-02-02T02:57:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_10-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_10-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T02:55:04.492502 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
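The flattened card text above ends its "To load the details from a run" sentence without the snippet it refers to; a minimal sketch of the intended call, assuming the dataset id follows the same `details_Gille__<model>` naming pattern used by the sibling cards in this dump:

```python
from datasets import load_dataset

# The repository id below is inferred from the naming pattern of the other
# evaluation-detail datasets (an assumption, not stated in the flattened text).
# Any of the listed config names (e.g. "harness_winogrande_5") can be loaded the same way.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_10-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
```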
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_10-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_10-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:55:04.492502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_10-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_10-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T02:55:04.492502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
84a62e6352eef00c9baca65d27b8b374aa007086
# Dataset Card for Evaluation run of Gille/StrangeMerges_13-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_13-7B-slerp](https://huggingface.co/Gille/StrangeMerges_13-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:01:57.627624](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp/blob/main/results_2024-02-02T03-01-57.627624.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6490020159727848, "acc_stderr": 0.03207880299561487, "acc_norm": 0.6522498316523601, "acc_norm_stderr": 0.032718201180501595, "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.48620814133438406, "mc2_stderr": 0.014705661692480883 }, "harness|arc:challenge|25": { "acc": 0.6049488054607508, "acc_stderr": 0.01428589829293817, "acc_norm": 0.6382252559726962, "acc_norm_stderr": 0.014041957945038082 }, "harness|hellaswag|10": { "acc": 0.6483768173670583, "acc_stderr": 0.004765012078929384, "acc_norm": 0.8495319657438757, "acc_norm_stderr": 0.0035679889653376993 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.035868792800803406, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.035868792800803406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, 
"acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.03246956919789958, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.03246956919789958 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7677419354838709, "acc_stderr": 0.024022256130308235, "acc_norm": 0.7677419354838709, "acc_norm_stderr": 0.024022256130308235 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971128, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971128 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.03068473711513536, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.03068473711513536 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.015776239256163224, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.015776239256163224 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572213, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572213 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494733, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8186462324393359, "acc_stderr": 0.013778693778464074, "acc_norm": 0.8186462324393359, "acc_norm_stderr": 0.013778693778464074 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508283, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.015949308790233645, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.015949308790233645 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.0248480182638752, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7623456790123457, "acc_stderr": 0.023683591837008553, "acc_norm": 0.7623456790123457, "acc_norm_stderr": 0.023683591837008553 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083135, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083135 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6879084967320261, "acc_stderr": 0.01874501120127766, "acc_norm": 0.6879084967320261, "acc_norm_stderr": 0.01874501120127766 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.02737294220178816, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.02737294220178816 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072766, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072766 }, "harness|truthfulqa:mc|0": { "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.48620814133438406, "mc2_stderr": 0.014705661692480883 }, "harness|winogrande|5": { "acc": 0.7987371744277821, "acc_stderr": 0.011268519971577682 }, "harness|gsm8k|5": { "acc": 0.5420773313115997, "acc_stderr": 0.013723629649844075 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp
[ "region:us" ]
2024-02-02T03:04:21+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_13-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_13-7B-slerp](https://huggingface.co/Gille/StrangeMerges_13-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:01:57.627624](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp/blob/main/results_2024-02-02T03-01-57.627624.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6490020159727848,\n \"acc_stderr\": 0.03207880299561487,\n \"acc_norm\": 0.6522498316523601,\n \"acc_norm_stderr\": 0.032718201180501595,\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.48620814133438406,\n \"mc2_stderr\": 0.014705661692480883\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.01428589829293817,\n \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038082\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n \"acc_stderr\": 0.004765012078929384,\n \"acc_norm\": 0.8495319657438757,\n \"acc_norm_stderr\": 0.0035679889653376993\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8186462324393359,\n \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008553,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008553\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.48620814133438406,\n \"mc2_stderr\": 0.014705661692480883\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5420773313115997,\n \"acc_stderr\": 
0.013723629649844075\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_13-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-01-57.627624.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-01-57.627624.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-01-57.627624.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-01-57.627624.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-01-57.627624.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_01_57.627624", "path": ["**/details_harness|winogrande|5_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-01-57.627624.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T03_01_57.627624", "path": ["results_2024-02-02T03-01-57.627624.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-01-57.627624.parquet"]}]}]}
2024-02-02T03:04:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_13-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_13-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:01:57.627624 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
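The load example referenced by "do the following" above was stripped from this plain-text rendering of the card. A minimal sketch, following the same pattern as the sibling cards in this dump (the repository id is inferred from the model name and the "harness_winogrande_5" config mirrors the example used in those cards, so treat both as assumptions):

```python
from datasets import load_dataset

# Sketch only: the repository id follows the open-llm-leaderboard/details_<org>__<model>
# naming pattern used by the other entries; the config name mirrors their examples.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_13-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
```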
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_13-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_13-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:01:57.627624(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_13-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_13-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:01:57.627624(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a399f456222756baa2c742a8f90ab7b9edcd6faf
# Summary Training and Validation Data for Shogi AI Development # Contents - shuffled.7z.00? ... Training Data - shuffled.bin ... Validation Data The training and validation data are in the YaneuraOu PackedSfenValue format. Both datasets were generated using Suisho5 with a search depth of 9. The training and validation data have already been shuffled. Positions within these datasets have been replaced with the PV (Principal Variation) leaf node from the quiescence search of the original position. Developers using this data should note that it is not necessary to perform a quiescence search on these positions to obtain the PV leaf node. # Links - nodchip/tanuki-: shogi engine (AI player), stronger than Bonanza6, educational and tiny code (about 2500 lines), USI compliant engine, capable of being compiled by VC++2015 https://github.com/nodchip/tanuki-
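For developers who want to inspect shuffled.bin directly, the records can be decoded with a few lines of Python. The sketch below is not part of the original distribution and assumes YaneuraOu's standard 40-byte PackedSfenValue layout (a 32-byte packed SFEN, an int16 evaluation score, a uint16 move, a uint16 game ply, an int8 game result, and one padding byte); check this against the YaneuraOu sources for your version before relying on it.

```python
# Minimal sketch for iterating over shuffled.bin (validation data).
# Assumption: each record is a 40-byte YaneuraOu PackedSfenValue:
#   32-byte packed SFEN, int16 score, uint16 move, uint16 game ply,
#   int8 game result, 1 padding byte.
import struct

RECORD_SIZE = 40
RECORD_FORMAT = "<32shHHbx"  # packed_sfen, score, move, game_ply, game_result (+ padding)

def iter_packed_sfen_values(path):
    """Yield decoded records from a PackedSfenValue file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(RECORD_SIZE)
            if len(chunk) < RECORD_SIZE:
                break
            sfen, score, move, game_ply, game_result = struct.unpack(RECORD_FORMAT, chunk)
            yield {"packed_sfen": sfen, "score": score, "move": move,
                   "game_ply": game_ply, "game_result": game_result}

# Peek at the first few validation records.
for i, record in enumerate(iter_packed_sfen_values("shuffled.bin")):
    print(record["score"], record["game_ply"], record["game_result"])
    if i >= 4:
        break
```

Note that the 32-byte packed SFEN is a compressed board encoding, so turning it back into a plain SFEN string requires the decoding routine from YaneuraOu itself (or a tool that reimplements it).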
nodchip/shogi_suisho5_depth9
[ "license:mit", "region:us" ]
2024-02-02T03:05:33+00:00
{"license": "mit"}
2024-02-02T05:33:10+00:00
[]
[]
TAGS #license-mit #region-us
# Summary Training and Validation Data for Shogi AI Development # Contents - shuffled.7z.00? ... Training Data - URL ... Validation Data The training and validation data are in the YaneuraOu PackedSfenValue format. Both datasets were generated using Suisho5 with a search depth of 9. The training and validation data have already been shuffled. Positions within these datasets have been replaced with the PV (Principal Variation) leaf node from the quiescence search of the original position. Developers using this data should note that it is not necessary to perform a quiescence search on these positions to obtain the PV leaf node. # Links - nodchip/tanuki-: shogi engine(AI player), stronger than Bonanza6 , educational and tiny code(about 2500 lines) , USI compliant engine , capable of being compiled by VC++2015 URL
[ "# Summary\nTraining and Validation Data for Shogi AI Development", "# Contents\n- shuffled.7z.00? ... Training Data\n- URL ... Validation Data\n\nThe training and validation data are in the YaneuraOu PackedSfenValue format.\n\nBoth datasets were generated using Suisho5 with a search depth of 9.\n\nThe training and validation data have already been shuffled.\nPositions within these datasets have been replaced with the PV (Principal Variation) leaf node from the quiescence search of the original position.\nDevelopers using this data should note that it is not necessary to perform a quiescence search on these positions to obtain the PV leaf node.", "# Links\n- nodchip/tanuki-: shogi engine(AI player), stronger than Bonanza6 , educational and tiny code(about 2500 lines) , USI compliant engine , capable of being compiled by VC++2015 URL" ]
[ "TAGS\n#license-mit #region-us \n", "# Summary\nTraining and Validation Data for Shogi AI Development", "# Contents\n- shuffled.7z.00? ... Training Data\n- URL ... Validation Data\n\nThe training and validation data are in the YaneuraOu PackedSfenValue format.\n\nBoth datasets were generated using Suisho5 with a search depth of 9.\n\nThe training and validation data have already been shuffled.\nPositions within these datasets have been replaced with the PV (Principal Variation) leaf node from the quiescence search of the original position.\nDevelopers using this data should note that it is not necessary to perform a quiescence search on these positions to obtain the PV leaf node.", "# Links\n- nodchip/tanuki-: shogi engine(AI player), stronger than Bonanza6 , educational and tiny code(about 2500 lines) , USI compliant engine , capable of being compiled by VC++2015 URL" ]
f169610f0508d3d38a9c283fe1fa93bd6c3ef03f
# Dataset Card for Evaluation run of Gille/StrangeMerges_16-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_16-7B-slerp](https://huggingface.co/Gille/StrangeMerges_16-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:08:22.269991](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp/blob/main/results_2024-02-02T03-08-22.269991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6605358074265332, "acc_stderr": 0.03176350775454718, "acc_norm": 0.6607069804303847, "acc_norm_stderr": 0.032415467522633835, "mc1": 0.4565483476132191, "mc1_stderr": 0.01743728095318369, "mc2": 0.629677373384675, "mc2_stderr": 0.01522731253886815 }, "harness|arc:challenge|25": { "acc": 0.6646757679180887, "acc_stderr": 0.01379618294778556, "acc_norm": 0.6902730375426621, "acc_norm_stderr": 0.013512058415238363 }, "harness|hellaswag|10": { "acc": 0.6878111929894444, "acc_stderr": 0.0046243936909669036, "acc_norm": 0.871539533957379, "acc_norm_stderr": 0.0033391798350182857 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.0373852067611967, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.0373852067611967 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.048523658709391, "acc_norm": 0.63, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.03533133389323657, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.03533133389323657 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.031967586978353627, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.031967586978353627 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02530590624159063, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02530590624159063 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554963, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554963 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.02911661760608301, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.02911661760608301 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7100840336134454, "acc_stderr": 0.029472485833136098, "acc_norm": 0.7100840336134454, "acc_norm_stderr": 0.029472485833136098 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 
0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.02508596114457966, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.02508596114457966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229143, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229143 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624734, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624734 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8378033205619413, "acc_stderr": 0.013182222616720885, "acc_norm": 0.8378033205619413, "acc_norm_stderr": 0.013182222616720885 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40893854748603353, "acc_stderr": 0.016442830654715544, "acc_norm": 0.40893854748603353, "acc_norm_stderr": 0.016442830654715544 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.02555316999182652, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.02555316999182652 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7746913580246914, "acc_stderr": 0.02324620264781975, "acc_norm": 0.7746913580246914, "acc_norm_stderr": 0.02324620264781975 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4791395045632334, "acc_stderr": 0.012759117066518015, "acc_norm": 0.4791395045632334, "acc_norm_stderr": 0.012759117066518015 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.01899970738316267, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.01899970738316267 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960234, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960234 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.02796678585916089, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.02796678585916089 }, "harness|truthfulqa:mc|0": { "mc1": 0.4565483476132191, "mc1_stderr": 0.01743728095318369, "mc2": 0.629677373384675, "mc2_stderr": 0.01522731253886815 }, "harness|winogrande|5": { "acc": 0.8129439621152328, "acc_stderr": 0.010959716435242914 }, "harness|gsm8k|5": { "acc": 0.7073540561031084, "acc_stderr": 0.012532334368242885 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
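As a short usage example for this card, the aggregated scores described above can be loaded from the additional "results" configuration; the "latest" split always points at the most recent run. A minimal sketch, reusing only names that appear in this card and its metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp",
    "results",
    split="latest",
)
print(results)
```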
open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp
[ "region:us" ]
2024-02-02T03:10:39+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_16-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_16-7B-slerp](https://huggingface.co/Gille/StrangeMerges_16-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:08:22.269991](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp/blob/main/results_2024-02-02T03-08-22.269991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6605358074265332,\n \"acc_stderr\": 0.03176350775454718,\n \"acc_norm\": 0.6607069804303847,\n \"acc_norm_stderr\": 0.032415467522633835,\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.629677373384675,\n \"mc2_stderr\": 0.01522731253886815\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6878111929894444,\n \"acc_stderr\": 0.0046243936909669036,\n \"acc_norm\": 0.871539533957379,\n \"acc_norm_stderr\": 0.0033391798350182857\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n 
\"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n \"acc_stderr\": 0.013182222616720885,\n \"acc_norm\": 
0.8378033205619413,\n \"acc_norm_stderr\": 0.013182222616720885\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.40893854748603353,\n \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.01743728095318369,\n \"mc2\": 0.629677373384675,\n \"mc2_stderr\": 0.01522731253886815\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242914\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \"acc_stderr\": 0.012532334368242885\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_16-7B-slerp", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-08-22.269991.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["**/details_harness|winogrande|5_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-08-22.269991.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_08_22.269991", "path": ["results_2024-02-02T03-08-22.269991.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T03-08-22.269991.parquet"]}]}]}
2024-02-02T03:11:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_16-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_16-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet below): ## Latest results These are the latest results from run 2024-02-02T03:08:22.269991 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
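The loading call referenced just above ("you can for instance do the following") was dropped from this flattened copy of the card; a minimal sketch is given here. As in the sketch further up, the repository id is assumed from the leaderboard's naming convention, and the config and split names are taken from this record's configs.

```python
from datasets import load_dataset

# Sketch of the loading call the card refers to; the repo id is assumed from
# the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_16-7B-slerp",
    "harness_winogrande_5",
    split="latest",
)
```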
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_16-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_16-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:08:22.269991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_16-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_16-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:08:22.269991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9fe08c3ffaaf2a0551173bdd61da99a786c161b4
# Danbooru SFW 512 Filtered and Cropped A version of Danbooru SFW which has been automatically filtered and cropped to 128x128 so that the resulting images focus almost entirely on characters. First, filtering was applied to remove non-character-focused images, then cropping was applied to remove horizontal/vertical bars and further focus on the central character(s) of each image. Both steps were performed automatically by two different vision models trained on manually labelled subsets of the original dataset. This technique provided satisfactory results; however, there is much room for improvement. ## Dataset Preprocessing - [db-sfw-512-character-filter-dataset](https://huggingface.co/datasets/hayden-donnelly/db-sfw-512-character-filter-dataset) was used to train the character filter model. - [db-sfw-512-crop-dataset](https://huggingface.co/datasets/hayden-donnelly/db-sfw-512-crop-dataset) was used to train the crop model. ## Original Dataset Citation ```bibtex @misc{danbooru2021, author={Anonymous and Danbooru community and Gwern Branwen}, title={Danbooru2021: A Large-Scale Crowdsourced and Tagged Anime Illustration Dataset}, howpublished={\url{https://gwern.net/danbooru2021}}, url={https://gwern.net/danbooru2021}, type={dataset}, year={2022}, month={January}, timestamp={2022-01-21}, note={Accessed: 2023-12-06} } ```
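A minimal loading sketch for the dataset described above; the split name and column layout are assumptions rather than details confirmed by the card.

```python
from datasets import load_dataset

# Sketch, assuming a "train" split and an image column; streaming avoids
# downloading the full 1M-10M image dataset up front.
ds = load_dataset(
    "hayden-donnelly/db-sfw-128-filtered-and-cropped-dataset",
    split="train",
    streaming=True,
)
example = next(iter(ds))
print(example)  # expected: a 128x128, character-focused image plus any label columns
```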
hayden-donnelly/db-sfw-128-filtered-and-cropped-dataset
[ "task_categories:image-classification", "task_categories:text-to-image", "task_categories:unconditional-image-generation", "size_categories:1M<n<10M", "region:us" ]
2024-02-02T03:11:30+00:00
{"size_categories": ["1M<n<10M"], "task_categories": ["image-classification", "text-to-image", "unconditional-image-generation"], "pretty_name": "Danbooru SFW 512 Filtered and Cropped Dataset"}
2024-02-02T08:09:21+00:00
[]
[]
TAGS #task_categories-image-classification #task_categories-text-to-image #task_categories-unconditional-image-generation #size_categories-1M<n<10M #region-us
# Danbooru SFW 512 Filtered and Cropped A version of Danbooru SFW which has been automatically filtered and cropped to 128x128 so that the resulting images focus almost entirely on characters. First, filtering was applied to remove non-character-focused images, then cropping was applied to remove horizontal/vertical bars and further focus on the central character(s) of each image. Both steps were performed automatically by two different vision models trained on manually labelled subsets of the original dataset. This technique provided satisfactory results; however, there is much room for improvement. ## Dataset Preprocessing - db-sfw-512-character-filter-dataset was used to train the character filter model. - db-sfw-512-crop-dataset was used to train the crop model. ## Original Dataset Citation
[ "# Danbooru SFW 512 Filtered and Cropped\n\nA version of Danbooru SFW which has been automatically filtered and cropped to 128x128 so that the resulting images focus almost entirely on characters.\nFirst filtering was applied to remove non-character focused images, then cropping was applied to remove horizontal/vertical bars and further focus on the\ncentral character(s) of each image. Both steps were performed automatically by two different vision models trained on manually labelled subsets of the original dataset.\nThis technique provided satisfactory results, however there is much room for improvement.", "## Dataset Preprocessing\n\n- db-sfw-512-character-filter-dataset was used to train the character filter model. \n- db-sfw-512-crop-dataset was used to train the crop model.", "## Original Dataset Citation" ]
[ "TAGS\n#task_categories-image-classification #task_categories-text-to-image #task_categories-unconditional-image-generation #size_categories-1M<n<10M #region-us \n", "# Danbooru SFW 512 Filtered and Cropped\n\nA version of Danbooru SFW which has been automatically filtered and cropped to 128x128 so that the resulting images focus almost entirely on characters.\nFirst filtering was applied to remove non-character focused images, then cropping was applied to remove horizontal/vertical bars and further focus on the\ncentral character(s) of each image. Both steps were performed automatically by two different vision models trained on manually labelled subsets of the original dataset.\nThis technique provided satisfactory results, however there is much room for improvement.", "## Dataset Preprocessing\n\n- db-sfw-512-character-filter-dataset was used to train the character filter model. \n- db-sfw-512-crop-dataset was used to train the crop model.", "## Original Dataset Citation" ]
d804780c1e1e02fce5bfcbe70218be8d12e6619b
# Dataset Card for Evaluation run of Gille/StrangeMerges_3-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_3-7B-slerp](https://huggingface.co/Gille/StrangeMerges_3-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_3-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:13:09.312794](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_3-7B-slerp/blob/main/results_2024-02-02T03-13-09.312794.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6563114190054364, "acc_stderr": 0.031977711703246356, "acc_norm": 0.656056587095913, "acc_norm_stderr": 0.03264168552070928, "mc1": 0.5275397796817626, "mc1_stderr": 0.01747693019071219, "mc2": 0.6885785972374051, "mc2_stderr": 0.014842898041557211 }, "harness|arc:challenge|25": { "acc": 0.6697952218430034, "acc_stderr": 0.013743085603760424, "acc_norm": 0.7081911262798635, "acc_norm_stderr": 0.013284525292403518 }, "harness|hellaswag|10": { "acc": 0.6958773152758415, "acc_stderr": 0.004590946839727177, "acc_norm": 0.877912766381199, "acc_norm_stderr": 0.00326717445844976 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.04094376269996792, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.04094376269996792 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700914, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700914 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.03533133389323657, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.03533133389323657 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.025467149045469553, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.025467149045469553 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356852, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356852 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.035107665979592154, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.02247325333276877, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.02247325333276877 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37407407407407406, "acc_stderr": 0.02950286112895529, "acc_norm": 0.37407407407407406, "acc_norm_stderr": 0.02950286112895529 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, 
"acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42569832402234636, "acc_stderr": 0.016536829648997112, "acc_norm": 0.42569832402234636, "acc_norm_stderr": 0.016536829648997112 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694912, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694912 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869649, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869649 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806315, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806315 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5275397796817626, "mc1_stderr": 0.01747693019071219, "mc2": 0.6885785972374051, "mc2_stderr": 0.014842898041557211 }, "harness|winogrande|5": { "acc": 0.8255722178374112, "acc_stderr": 0.010665187902498438 }, "harness|gsm8k|5": { "acc": 0.7225170583775588, "acc_stderr": 0.01233344758104755 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Gille__StrangeMerges_3-7B-slerp
[ "region:us" ]
2024-02-02T03:15:33+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_3-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_3-7B-slerp](https://huggingface.co/Gille/StrangeMerges_3-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_3-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:13:09.312794](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_3-7B-slerp/blob/main/results_2024-02-02T03-13-09.312794.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563114190054364,\n \"acc_stderr\": 0.031977711703246356,\n \"acc_norm\": 0.656056587095913,\n \"acc_norm_stderr\": 0.03264168552070928,\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6885785972374051,\n \"mc2_stderr\": 0.014842898041557211\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760424,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403518\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6958773152758415,\n \"acc_stderr\": 0.004590946839727177,\n \"acc_norm\": 0.877912766381199,\n \"acc_norm_stderr\": 0.00326717445844976\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.016536829648997112,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.016536829648997112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6885785972374051,\n \"mc2_stderr\": 0.014842898041557211\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 
0.01233344758104755\n }\n}\n```", "repo_url": "https://huggingface.co/Gille/StrangeMerges_3-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-13-09.312794.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-13-09.312794.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-13-09.312794.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-13-09.312794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-13-09.312794.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_13_09.312794", "path": ["**/details_harness|winogrande|5_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-13-09.312794.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T03_13_09.312794", "path": ["results_2024-02-02T03-13-09.312794.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-13-09.312794.parquet"]}]}]}
2024-02-02T03:15:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_3-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_3-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:13:09.312794 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_3-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_3-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:13:09.312794(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_3-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_3-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:13:09.312794(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
db090e42b3e640d2b860c91eef7cff44cc7c94b2
# Dataset Card for Evaluation run of AA051611/V0201 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051611/V0201](https://huggingface.co/AA051611/V0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051611__V0201", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:15:18.446534](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0201/blob/main/results_2024-02-02T03-15-18.446534.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.8722899277105903, "acc_stderr": 0.021779827433248626, "acc_norm": 0.8832174168880055, "acc_norm_stderr": 0.022071903413890245, "mc1": 0.36474908200734396, "mc1_stderr": 0.01685096106172011, "mc2": 0.5375985523274007, "mc2_stderr": 0.015202763451961539 }, "harness|arc:challenge|25": { "acc": 0.6339590443686007, "acc_stderr": 0.014077223108470139, "acc_norm": 0.6723549488054608, "acc_norm_stderr": 0.013715847940719337 }, "harness|hellaswag|10": { "acc": 0.6309500099581756, "acc_stderr": 0.004815613144385407, "acc_norm": 0.8330013941445927, "acc_norm_stderr": 0.00372212370961046 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.837037037037037, "acc_stderr": 0.03190541474482841, "acc_norm": 0.837037037037037, "acc_norm_stderr": 0.03190541474482841 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.9539473684210527, "acc_stderr": 0.01705693362806048, "acc_norm": 0.9539473684210527, "acc_norm_stderr": 0.01705693362806048 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.9358490566037736, "acc_stderr": 0.015080038966069792, "acc_norm": 0.9358490566037736, "acc_norm_stderr": 0.015080038966069792 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9652777777777778, "acc_stderr": 0.01530953117500374, "acc_norm": 0.9652777777777778, "acc_norm_stderr": 0.01530953117500374 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.86, "acc_stderr": 0.03487350880197772, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197772 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.78,
"acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.8786127167630058, "acc_stderr": 0.024901248066383764, "acc_norm": 0.8786127167630058, "acc_norm_stderr": 0.024901248066383764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.7745098039215687, "acc_stderr": 0.04158307533083286, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.04158307533083286 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.91, "acc_stderr": 0.028762349126466115, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466115 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8936170212765957, "acc_stderr": 0.02015597730704985, "acc_norm": 0.8936170212765957, "acc_norm_stderr": 0.02015597730704985 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.7894736842105263, "acc_stderr": 0.0383515395439942, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.0383515395439942 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8896551724137931, "acc_stderr": 0.026109923428966807, "acc_norm": 0.8896551724137931, "acc_norm_stderr": 0.026109923428966807 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.8862433862433863, "acc_stderr": 0.016352876480494796, "acc_norm": 0.8862433862433863, "acc_norm_stderr": 0.016352876480494796 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.7301587301587301, "acc_stderr": 0.03970158273235171, "acc_norm": 0.7301587301587301, "acc_norm_stderr": 0.03970158273235171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.84, "acc_stderr": 0.036845294917747115, "acc_norm": 0.84, "acc_norm_stderr": 0.036845294917747115 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9612903225806452, "acc_stderr": 0.010973819726797958, "acc_norm": 0.9612903225806452, "acc_norm_stderr": 0.010973819726797958 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.8078817733990148, "acc_stderr": 0.02771931570961478, "acc_norm": 0.8078817733990148, "acc_norm_stderr": 0.02771931570961478 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.91, "acc_stderr": 0.028762349126466115, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466115 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.9212121212121213, "acc_stderr": 0.021037183825716357, "acc_norm": 0.9212121212121213, "acc_norm_stderr": 0.021037183825716357 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9646464646464646, "acc_stderr": 0.01315731887804608, "acc_norm": 0.9646464646464646, "acc_norm_stderr": 0.01315731887804608 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9844559585492227, "acc_stderr": 0.008927492715084346, "acc_norm": 0.9844559585492227, "acc_norm_stderr": 0.008927492715084346 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.9128205128205128, "acc_stderr": 0.014302931207177386, "acc_norm": 0.9128205128205128, "acc_norm_stderr": 0.014302931207177386 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.7888888888888889, "acc_stderr": 0.024882116857655078, "acc_norm": 0.7888888888888889, "acc_norm_stderr": 0.024882116857655078 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.9411764705882353, "acc_stderr": 0.015283995352038426, "acc_norm": 0.9411764705882353, "acc_norm_stderr": 0.015283995352038426 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.7682119205298014, "acc_stderr": 0.03445406271987054, "acc_norm": 0.7682119205298014, 
"acc_norm_stderr": 0.03445406271987054 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9743119266055046, "acc_stderr": 0.006782898624451454, "acc_norm": 0.9743119266055046, "acc_norm_stderr": 0.006782898624451454 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.8333333333333334, "acc_stderr": 0.02541642838876747, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.02541642838876747 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9705882352941176, "acc_stderr": 0.011858507536737417, "acc_norm": 0.9705882352941176, "acc_norm_stderr": 0.011858507536737417 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9493670886075949, "acc_stderr": 0.014271760025370188, "acc_norm": 0.9493670886075949, "acc_norm_stderr": 0.014271760025370188 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8923766816143498, "acc_stderr": 0.020799400082880004, "acc_norm": 0.8923766816143498, "acc_norm_stderr": 0.020799400082880004 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9083969465648855, "acc_stderr": 0.025300035578642962, "acc_norm": 0.9083969465648855, "acc_norm_stderr": 0.025300035578642962 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9173553719008265, "acc_stderr": 0.02513538235660422, "acc_norm": 0.9173553719008265, "acc_norm_stderr": 0.02513538235660422 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.9629629629629629, "acc_stderr": 0.018257067489429676, "acc_norm": 0.9629629629629629, "acc_norm_stderr": 0.018257067489429676 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.9447852760736196, "acc_stderr": 0.017944712448654636, "acc_norm": 0.9447852760736196, "acc_norm_stderr": 0.017944712448654636 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.8392857142857143, "acc_stderr": 0.034859460964757415, "acc_norm": 0.8392857142857143, "acc_norm_stderr": 0.034859460964757415 }, "harness|hendrycksTest-management|5": { "acc": 0.9611650485436893, "acc_stderr": 0.019129793517354922, "acc_norm": 0.9611650485436893, "acc_norm_stderr": 0.019129793517354922 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9829059829059829, "acc_stderr": 0.008491806622565604, "acc_norm": 0.9829059829059829, "acc_norm_stderr": 0.008491806622565604 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.92, "acc_stderr": 0.027265992434429086, "acc_norm": 0.92, "acc_norm_stderr": 0.027265992434429086 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9578544061302682, "acc_stderr": 0.007184928704935858, "acc_norm": 0.9578544061302682, "acc_norm_stderr": 0.007184928704935858 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8872832369942196, "acc_stderr": 0.017026126074681635, "acc_norm": 0.8872832369942196, "acc_norm_stderr": 0.017026126074681635 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.9050279329608939, "acc_stderr": 0.009805284011337068, "acc_norm": 0.9050279329608939, "acc_norm_stderr": 0.009805284011337068 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.9313725490196079, "acc_stderr": 0.014476405218161428, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.014476405218161428 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8971061093247589, "acc_stderr": 0.017255830051445344, "acc_norm": 0.8971061093247589, "acc_norm_stderr": 0.017255830051445344 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.9197530864197531, "acc_stderr": 0.015116405542849367, "acc_norm": 0.9197530864197531, "acc_norm_stderr": 0.015116405542849367 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.7872340425531915, "acc_stderr": 0.024414612974307713, "acc_norm": 0.7872340425531915, "acc_norm_stderr": 0.024414612974307713 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.8305084745762712, "acc_stderr": 0.009582414456640188, "acc_norm": 0.8305084745762712, "acc_norm_stderr": 0.009582414456640188 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.9411764705882353, "acc_stderr": 0.014293099746606797, "acc_norm": 0.9411764705882353, "acc_norm_stderr": 0.014293099746606797 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.9019607843137255, "acc_stderr": 0.012030208014297142, "acc_norm": 0.9019607843137255, "acc_norm_stderr": 0.012030208014297142 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.8454545454545455, "acc_stderr": 0.03462262571262667, "acc_norm": 0.8454545454545455, "acc_norm_stderr": 0.03462262571262667 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.9061224489795918, "acc_stderr": 0.018671508543506656, "acc_norm": 0.9061224489795918, "acc_norm_stderr": 0.018671508543506656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.9751243781094527, "acc_stderr": 0.011012907274218229, "acc_norm": 0.9751243781094527, "acc_norm_stderr": 0.011012907274218229 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.96, "acc_stderr": 0.01969463855669321, "acc_norm": 0.96, "acc_norm_stderr": 0.01969463855669321 }, "harness|hendrycksTest-virology|5": { "acc": 0.7168674698795181, "acc_stderr": 0.03507295431370519, "acc_norm": 0.7168674698795181, "acc_norm_stderr": 0.03507295431370519 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.9298245614035088, "acc_stderr": 0.019591541754525123, "acc_norm": 0.9298245614035088, "acc_norm_stderr": 0.019591541754525123 }, "harness|truthfulqa:mc|0": { "mc1": 0.36474908200734396, "mc1_stderr": 0.01685096106172011, "mc2": 0.5375985523274007, "mc2_stderr": 0.015202763451961539 }, "harness|winogrande|5": { "acc": 0.8050513022888713, "acc_stderr": 0.011134099415938263 }, "harness|gsm8k|5": { "acc": 0.535253980288097, "acc_stderr": 0.013738207990177317 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
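As a brief usage addendum to the loading example earlier in this card: the aggregated metrics shown under "Latest results" are also available directly through the `results` configuration and its `latest` split, both of which are declared in this card's `configs` metadata. Below is a minimal sketch; the repository, configuration, and split names come from this card, but the exact column layout of the results parquet is not documented here and should be treated as something to inspect.

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
# The "results" config and its "latest" split are declared in this card's
# configs metadata; the parquet's column layout is not documented here,
# so print the dataset first to see which fields are available.
results = load_dataset(
    "open-llm-leaderboard/details_AA051611__V0201",
    "results",
    split="latest",
)

print(results)      # column names and row count
print(results[0])   # aggregated metrics for the latest run
```

Per-task details follow the same pattern, using the `harness_*` configurations listed in the metadata below (for example `harness_winogrande_5`, as in the loading example above).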
open-llm-leaderboard/details_AA051611__V0201
[ "region:us" ]
2024-02-02T03:17:30+00:00
{"pretty_name": "Evaluation run of AA051611/V0201", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/V0201](https://huggingface.co/AA051611/V0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__V0201\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:15:18.446534](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0201/blob/main/results_2024-02-02T03-15-18.446534.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8722899277105903,\n \"acc_stderr\": 0.021779827433248626,\n \"acc_norm\": 0.8832174168880055,\n \"acc_norm_stderr\": 0.022071903413890245,\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5375985523274007,\n \"mc2_stderr\": 0.015202763451961539\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470139,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6309500099581756,\n \"acc_stderr\": 0.004815613144385407,\n \"acc_norm\": 0.8330013941445927,\n \"acc_norm_stderr\": 0.00372212370961046\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.837037037037037,\n \"acc_stderr\": 0.03190541474482841,\n \"acc_norm\": 0.837037037037037,\n \"acc_norm_stderr\": 0.03190541474482841\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9539473684210527,\n \"acc_stderr\": 0.01705693362806048,\n \"acc_norm\": 0.9539473684210527,\n \"acc_norm_stderr\": 0.01705693362806048\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.9358490566037736,\n \"acc_stderr\": 0.015080038966069792,\n \"acc_norm\": 0.9358490566037736,\n \"acc_norm_stderr\": 0.015080038966069792\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9652777777777778,\n \"acc_stderr\": 0.01530953117500374,\n \"acc_norm\": 0.9652777777777778,\n \"acc_norm_stderr\": 0.01530953117500374\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8786127167630058,\n \"acc_stderr\": 0.024901248066383764,\n \"acc_norm\": 0.8786127167630058,\n \"acc_norm_stderr\": 0.024901248066383764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8936170212765957,\n \"acc_stderr\": 0.02015597730704985,\n \"acc_norm\": 0.8936170212765957,\n \"acc_norm_stderr\": 0.02015597730704985\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8896551724137931,\n \"acc_stderr\": 0.026109923428966807,\n \"acc_norm\": 0.8896551724137931,\n \"acc_norm_stderr\": 0.026109923428966807\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.8862433862433863,\n \"acc_stderr\": 0.016352876480494796,\n \"acc_norm\": 0.8862433862433863,\n \"acc_norm_stderr\": 0.016352876480494796\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.7301587301587301,\n \"acc_stderr\": 0.03970158273235171,\n \"acc_norm\": 0.7301587301587301,\n \"acc_norm_stderr\": 0.03970158273235171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9612903225806452,\n \"acc_stderr\": 0.010973819726797958,\n \"acc_norm\": 0.9612903225806452,\n \"acc_norm_stderr\": 0.010973819726797958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.8078817733990148,\n \"acc_stderr\": 0.02771931570961478,\n \"acc_norm\": 0.8078817733990148,\n \"acc_norm_stderr\": 0.02771931570961478\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9212121212121213,\n \"acc_stderr\": 0.021037183825716357,\n \"acc_norm\": 0.9212121212121213,\n \"acc_norm_stderr\": 0.021037183825716357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9646464646464646,\n \"acc_stderr\": 0.01315731887804608,\n \"acc_norm\": 0.9646464646464646,\n \"acc_norm_stderr\": 0.01315731887804608\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084346,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.9128205128205128,\n \"acc_stderr\": 
0.014302931207177386,\n \"acc_norm\": 0.9128205128205128,\n \"acc_norm_stderr\": 0.014302931207177386\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.7888888888888889,\n \"acc_stderr\": 0.024882116857655078,\n \"acc_norm\": 0.7888888888888889,\n \"acc_norm_stderr\": 0.024882116857655078\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.015283995352038426,\n \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.015283995352038426\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.7682119205298014,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.7682119205298014,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9743119266055046,\n \"acc_stderr\": 0.006782898624451454,\n \"acc_norm\": 0.9743119266055046,\n \"acc_norm_stderr\": 0.006782898624451454\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02541642838876747,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02541642838876747\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9705882352941176,\n \"acc_stderr\": 0.011858507536737417,\n \"acc_norm\": 0.9705882352941176,\n \"acc_norm_stderr\": 0.011858507536737417\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370188,\n \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370188\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8923766816143498,\n \"acc_stderr\": 0.020799400082880004,\n \"acc_norm\": 0.8923766816143498,\n \"acc_norm_stderr\": 0.020799400082880004\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9173553719008265,\n \"acc_stderr\": 0.02513538235660422,\n \"acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.02513538235660422\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9629629629629629,\n \"acc_stderr\": 0.018257067489429676,\n \"acc_norm\": 0.9629629629629629,\n \"acc_norm_stderr\": 0.018257067489429676\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9447852760736196,\n \"acc_stderr\": 0.017944712448654636,\n \"acc_norm\": 0.9447852760736196,\n \"acc_norm_stderr\": 0.017944712448654636\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.8392857142857143,\n \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.8392857142857143,\n \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9611650485436893,\n \"acc_stderr\": 0.019129793517354922,\n \"acc_norm\": 0.9611650485436893,\n \"acc_norm_stderr\": 0.019129793517354922\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9829059829059829,\n \"acc_stderr\": 0.008491806622565604,\n \"acc_norm\": 0.9829059829059829,\n \"acc_norm_stderr\": 0.008491806622565604\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429086,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9578544061302682,\n \"acc_stderr\": 0.007184928704935858,\n \"acc_norm\": 0.9578544061302682,\n 
\"acc_norm_stderr\": 0.007184928704935858\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8872832369942196,\n \"acc_stderr\": 0.017026126074681635,\n \"acc_norm\": 0.8872832369942196,\n \"acc_norm_stderr\": 0.017026126074681635\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.9050279329608939,\n \"acc_stderr\": 0.009805284011337068,\n \"acc_norm\": 0.9050279329608939,\n \"acc_norm_stderr\": 0.009805284011337068\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.014476405218161428,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.014476405218161428\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8971061093247589,\n \"acc_stderr\": 0.017255830051445344,\n \"acc_norm\": 0.8971061093247589,\n \"acc_norm_stderr\": 0.017255830051445344\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.9197530864197531,\n \"acc_stderr\": 0.015116405542849367,\n \"acc_norm\": 0.9197530864197531,\n \"acc_norm_stderr\": 0.015116405542849367\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.024414612974307713,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.024414612974307713\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8305084745762712,\n \"acc_stderr\": 0.009582414456640188,\n \"acc_norm\": 0.8305084745762712,\n \"acc_norm_stderr\": 0.009582414456640188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.014293099746606797,\n \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.014293099746606797\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.012030208014297142,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.012030208014297142\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8454545454545455,\n \"acc_stderr\": 0.03462262571262667,\n \"acc_norm\": 0.8454545454545455,\n \"acc_norm_stderr\": 0.03462262571262667\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.9061224489795918,\n \"acc_stderr\": 0.018671508543506656,\n \"acc_norm\": 0.9061224489795918,\n \"acc_norm_stderr\": 0.018671508543506656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9751243781094527,\n \"acc_stderr\": 0.011012907274218229,\n \"acc_norm\": 0.9751243781094527,\n \"acc_norm_stderr\": 0.011012907274218229\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.7168674698795181,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.7168674698795181,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5375985523274007,\n \"mc2_stderr\": 0.015202763451961539\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.535253980288097,\n \"acc_stderr\": 0.013738207990177317\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/V0201", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["**/details_harness|winogrande|5_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-15-18.446534.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_15_18.446534", "path": ["results_2024-02-02T03-15-18.446534.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T03-15-18.446534.parquet"]}]}]}
2024-02-02T03:18:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051611/V0201 Dataset automatically created during the evaluation run of model AA051611/V0201 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:15:18.446534 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of AA051611/V0201\n\n\n\nDataset automatically created during the evaluation run of model AA051611/V0201 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:15:18.446534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051611/V0201\n\n\n\nDataset automatically created during the evaluation run of model AA051611/V0201 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:15:18.446534(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
eceb3db234d7e1eb0451f167e9f311607cc409c6
# Dataset Card for Evaluation run of AA051611/O0201 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051611/O0201](https://huggingface.co/AA051611/O0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051611__O0201", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:15:57.040098](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__O0201/blob/main/results_2024-02-02T03-15-57.040098.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.8785205668188928, "acc_stderr": 0.02108960224655999, "acc_norm": 0.8890480079512375, "acc_norm_stderr": 0.021357047298961734, "mc1": 0.40024479804161567, "mc1_stderr": 0.017151605555749138, "mc2": 0.5863202491791143, "mc2_stderr": 0.015280659551121102 }, "harness|arc:challenge|25": { "acc": 0.6476109215017065, "acc_stderr": 0.013960142600598685, "acc_norm": 0.6783276450511946, "acc_norm_stderr": 0.013650488084494162 }, "harness|hellaswag|10": { "acc": 0.6456881099382593, "acc_stderr": 0.004773267510112743, "acc_norm": 0.844851623182633, "acc_norm_stderr": 0.0036130615166899793 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.8666666666666667, "acc_stderr": 0.029365879728106854, "acc_norm": 0.8666666666666667, "acc_norm_stderr": 0.029365879728106854 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.9276315789473685, "acc_stderr": 0.021085011261884105, "acc_norm": 0.9276315789473685, "acc_norm_stderr": 0.021085011261884105 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.84, "acc_stderr": 0.03684529491774711, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774711 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.9283018867924528, "acc_stderr": 0.015878026288737926, "acc_norm": 0.9283018867924528, "acc_norm_stderr": 0.015878026288737926 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9513888888888888, "acc_stderr": 0.017983689383153575, "acc_norm": 0.9513888888888888, "acc_norm_stderr": 0.017983689383153575 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.84, "acc_stderr": 0.03684529491774711, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774711 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 
0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.8670520231213873, "acc_stderr": 0.025888042979662292, "acc_norm": 0.8670520231213873, "acc_norm_stderr": 0.025888042979662292 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.7549019607843137, "acc_stderr": 0.04280105837364395, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.04280105837364395 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.91, "acc_stderr": 0.028762349126466108, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466108 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.9234042553191489, "acc_stderr": 0.017385625826369294, "acc_norm": 0.9234042553191489, "acc_norm_stderr": 0.017385625826369294 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.8421052631578947, "acc_stderr": 0.03430265978485699, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.03430265978485699 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8827586206896552, "acc_stderr": 0.026808974229173797, "acc_norm": 0.8827586206896552, "acc_norm_stderr": 0.026808974229173797 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.8677248677248677, "acc_stderr": 0.01744855429068043, "acc_norm": 0.8677248677248677, "acc_norm_stderr": 0.01744855429068043 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.6904761904761905, "acc_stderr": 0.04134913018303318, "acc_norm": 0.6904761904761905, "acc_norm_stderr": 0.04134913018303318 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9709677419354839, "acc_stderr": 0.00955132381346253, "acc_norm": 0.9709677419354839, "acc_norm_stderr": 0.00955132381346253 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.8374384236453202, "acc_stderr": 0.025960300064605587, "acc_norm": 0.8374384236453202, "acc_norm_stderr": 0.025960300064605587 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.94, "acc_stderr": 0.02386832565759418, "acc_norm": 0.94, "acc_norm_stderr": 0.02386832565759418 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.9515151515151515, "acc_stderr": 0.016772158250856272, "acc_norm": 0.9515151515151515, "acc_norm_stderr": 0.016772158250856272 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.98989898989899, "acc_stderr": 0.0071243415212508135, "acc_norm": 0.98989898989899, "acc_norm_stderr": 0.0071243415212508135 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9844559585492227, "acc_stderr": 0.008927492715084346, "acc_norm": 0.9844559585492227, "acc_norm_stderr": 0.008927492715084346 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.9230769230769231, "acc_stderr": 0.013510532610273879, "acc_norm": 0.9230769230769231, "acc_norm_stderr": 0.013510532610273879 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.8074074074074075, "acc_stderr": 0.02404307518194519, "acc_norm": 0.8074074074074075, "acc_norm_stderr": 0.02404307518194519 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.9495798319327731, "acc_stderr": 0.01421326039188437, "acc_norm": 0.9495798319327731, "acc_norm_stderr": 0.01421326039188437 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.7947019867549668, "acc_stderr": 0.03297986648473835, "acc_norm": 0.7947019867549668, 
"acc_norm_stderr": 0.03297986648473835 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9688073394495413, "acc_stderr": 0.00745324624278531, "acc_norm": 0.9688073394495413, "acc_norm_stderr": 0.00745324624278531 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.8703703703703703, "acc_stderr": 0.022907883151288604, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.022907883151288604 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9852941176470589, "acc_stderr": 0.008448516754761201, "acc_norm": 0.9852941176470589, "acc_norm_stderr": 0.008448516754761201 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9662447257383966, "acc_stderr": 0.011755967781486706, "acc_norm": 0.9662447257383966, "acc_norm_stderr": 0.011755967781486706 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.9237668161434978, "acc_stderr": 0.017810524970082207, "acc_norm": 0.9237668161434978, "acc_norm_stderr": 0.017810524970082207 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9465648854961832, "acc_stderr": 0.01972499449971275, "acc_norm": 0.9465648854961832, "acc_norm_stderr": 0.01972499449971275 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9256198347107438, "acc_stderr": 0.023952688836676752, "acc_norm": 0.9256198347107438, "acc_norm_stderr": 0.023952688836676752 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.9629629629629629, "acc_stderr": 0.01825706748942968, "acc_norm": 0.9629629629629629, "acc_norm_stderr": 0.01825706748942968 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.9754601226993865, "acc_stderr": 0.012155797205267207, "acc_norm": 0.9754601226993865, "acc_norm_stderr": 0.012155797205267207 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.8392857142857143, "acc_stderr": 0.034859460964757415, "acc_norm": 0.8392857142857143, "acc_norm_stderr": 0.034859460964757415 }, "harness|hendrycksTest-management|5": { "acc": 0.9611650485436893, "acc_stderr": 0.01912979351735493, "acc_norm": 0.9611650485436893, "acc_norm_stderr": 0.01912979351735493 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9829059829059829, "acc_stderr": 0.008491806622565629, "acc_norm": 0.9829059829059829, "acc_norm_stderr": 0.008491806622565629 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.91, "acc_stderr": 0.028762349126466143, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466143 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9553001277139208, "acc_stderr": 0.007389578763460815, "acc_norm": 0.9553001277139208, "acc_norm_stderr": 0.007389578763460815 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8959537572254336, "acc_stderr": 0.016437904423993795, "acc_norm": 0.8959537572254336, "acc_norm_stderr": 0.016437904423993795 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.9016759776536313, "acc_stderr": 0.009958325296075616, "acc_norm": 0.9016759776536313, "acc_norm_stderr": 0.009958325296075616 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.9411764705882353, "acc_stderr": 0.013472901983855722, "acc_norm": 0.9411764705882353, "acc_norm_stderr": 0.013472901983855722 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.9260450160771704, "acc_stderr": 0.014863426023220673, "acc_norm": 0.9260450160771704, "acc_norm_stderr": 0.014863426023220673 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.9228395061728395, "acc_stderr": 0.014847704893944923, "acc_norm": 0.9228395061728395, "acc_norm_stderr": 0.014847704893944923 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.8085106382978723, "acc_stderr": 0.023472645247949443, "acc_norm": 0.8085106382978723, "acc_norm_stderr": 0.023472645247949443 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.8761408083441982, "acc_stderr": 0.008413563905877691, "acc_norm": 0.8761408083441982, "acc_norm_stderr": 0.008413563905877691 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.9522058823529411, "acc_stderr": 0.012958896125913097, "acc_norm": 0.9522058823529411, "acc_norm_stderr": 0.012958896125913097 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.9101307189542484, "acc_stderr": 0.011570094738536468, "acc_norm": 0.9101307189542484, "acc_norm_stderr": 0.011570094738536468 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.8727272727272727, "acc_stderr": 0.03192226512468566, "acc_norm": 0.8727272727272727, "acc_norm_stderr": 0.03192226512468566 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.926530612244898, "acc_stderr": 0.016702757799433237, "acc_norm": 0.926530612244898, "acc_norm_stderr": 0.016702757799433237 }, "harness|hendrycksTest-sociology|5": { "acc": 0.9651741293532339, "acc_stderr": 0.012963994249547642, "acc_norm": 0.9651741293532339, "acc_norm_stderr": 0.012963994249547642 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.97, "acc_stderr": 0.01714466079977652, "acc_norm": 0.97, "acc_norm_stderr": 0.01714466079977652 }, "harness|hendrycksTest-virology|5": { "acc": 0.7048192771084337, "acc_stderr": 0.0355092018568963, "acc_norm": 0.7048192771084337, "acc_norm_stderr": 0.0355092018568963 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.9415204678362573, "acc_stderr": 0.017996678857280134, "acc_norm": 0.9415204678362573, "acc_norm_stderr": 0.017996678857280134 }, "harness|truthfulqa:mc|0": { "mc1": 0.40024479804161567, "mc1_stderr": 0.017151605555749138, "mc2": 0.5863202491791143, "mc2_stderr": 0.015280659551121102 }, "harness|winogrande|5": { "acc": 0.797947908445146, "acc_stderr": 0.011285013754047451 }, "harness|gsm8k|5": { "acc": 0.5678544351781653, "acc_stderr": 0.013645072137842447 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_AA051611__O0201
[ "region:us" ]
2024-02-02T03:18:05+00:00
{"pretty_name": "Evaluation run of AA051611/O0201", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/O0201](https://huggingface.co/AA051611/O0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__O0201\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:15:57.040098](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__O0201/blob/main/results_2024-02-02T03-15-57.040098.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8785205668188928,\n \"acc_stderr\": 0.02108960224655999,\n \"acc_norm\": 0.8890480079512375,\n \"acc_norm_stderr\": 0.021357047298961734,\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5863202491791143,\n \"mc2_stderr\": 0.015280659551121102\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598685,\n \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6456881099382593,\n \"acc_stderr\": 0.004773267510112743,\n \"acc_norm\": 0.844851623182633,\n \"acc_norm_stderr\": 0.0036130615166899793\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.029365879728106854,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.029365879728106854\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9276315789473685,\n \"acc_stderr\": 0.021085011261884105,\n \"acc_norm\": 0.9276315789473685,\n \"acc_norm_stderr\": 0.021085011261884105\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.9283018867924528,\n \"acc_stderr\": 0.015878026288737926,\n \"acc_norm\": 0.9283018867924528,\n \"acc_norm_stderr\": 0.015878026288737926\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9513888888888888,\n \"acc_stderr\": 0.017983689383153575,\n \"acc_norm\": 0.9513888888888888,\n \"acc_norm_stderr\": 0.017983689383153575\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n 
\"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8670520231213873,\n \"acc_stderr\": 0.025888042979662292,\n \"acc_norm\": 0.8670520231213873,\n \"acc_norm_stderr\": 0.025888042979662292\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.9234042553191489,\n \"acc_stderr\": 0.017385625826369294,\n \"acc_norm\": 0.9234042553191489,\n \"acc_norm_stderr\": 0.017385625826369294\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.03430265978485699,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.03430265978485699\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8827586206896552,\n \"acc_stderr\": 0.026808974229173797,\n \"acc_norm\": 0.8827586206896552,\n \"acc_norm_stderr\": 0.026808974229173797\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.8677248677248677,\n \"acc_stderr\": 0.01744855429068043,\n \"acc_norm\": 0.8677248677248677,\n \"acc_norm_stderr\": 0.01744855429068043\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6904761904761905,\n \"acc_stderr\": 0.04134913018303318,\n \"acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.04134913018303318\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9709677419354839,\n \"acc_stderr\": 0.00955132381346253,\n \"acc_norm\": 0.9709677419354839,\n \"acc_norm_stderr\": 0.00955132381346253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.8374384236453202,\n \"acc_stderr\": 0.025960300064605587,\n \"acc_norm\": 0.8374384236453202,\n \"acc_norm_stderr\": 0.025960300064605587\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759418,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759418\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9515151515151515,\n \"acc_stderr\": 0.016772158250856272,\n \"acc_norm\": 0.9515151515151515,\n \"acc_norm_stderr\": 0.016772158250856272\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.98989898989899,\n \"acc_stderr\": 0.0071243415212508135,\n \"acc_norm\": 0.98989898989899,\n \"acc_norm_stderr\": 0.0071243415212508135\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084346,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.9230769230769231,\n 
\"acc_stderr\": 0.013510532610273879,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.013510532610273879\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.8074074074074075,\n \"acc_stderr\": 0.02404307518194519,\n \"acc_norm\": 0.8074074074074075,\n \"acc_norm_stderr\": 0.02404307518194519\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.9495798319327731,\n \"acc_stderr\": 0.01421326039188437,\n \"acc_norm\": 0.9495798319327731,\n \"acc_norm_stderr\": 0.01421326039188437\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.7947019867549668,\n \"acc_stderr\": 0.03297986648473835,\n \"acc_norm\": 0.7947019867549668,\n \"acc_norm_stderr\": 0.03297986648473835\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9688073394495413,\n \"acc_stderr\": 0.00745324624278531,\n \"acc_norm\": 0.9688073394495413,\n \"acc_norm_stderr\": 0.00745324624278531\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.022907883151288604,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.022907883151288604\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9852941176470589,\n \"acc_stderr\": 0.008448516754761201,\n \"acc_norm\": 0.9852941176470589,\n \"acc_norm_stderr\": 0.008448516754761201\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9662447257383966,\n \"acc_stderr\": 0.011755967781486706,\n \"acc_norm\": 0.9662447257383966,\n \"acc_norm_stderr\": 0.011755967781486706\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.9237668161434978,\n \"acc_stderr\": 0.017810524970082207,\n \"acc_norm\": 0.9237668161434978,\n \"acc_norm_stderr\": 0.017810524970082207\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9465648854961832,\n \"acc_stderr\": 0.01972499449971275,\n \"acc_norm\": 0.9465648854961832,\n \"acc_norm_stderr\": 0.01972499449971275\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9256198347107438,\n \"acc_stderr\": 0.023952688836676752,\n \"acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.023952688836676752\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9629629629629629,\n \"acc_stderr\": 0.01825706748942968,\n \"acc_norm\": 0.9629629629629629,\n \"acc_norm_stderr\": 0.01825706748942968\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9754601226993865,\n \"acc_stderr\": 0.012155797205267207,\n \"acc_norm\": 0.9754601226993865,\n \"acc_norm_stderr\": 0.012155797205267207\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.8392857142857143,\n \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.8392857142857143,\n \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9611650485436893,\n \"acc_stderr\": 0.01912979351735493,\n \"acc_norm\": 0.9611650485436893,\n \"acc_norm_stderr\": 0.01912979351735493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9829059829059829,\n \"acc_stderr\": 0.008491806622565629,\n \"acc_norm\": 0.9829059829059829,\n \"acc_norm_stderr\": 0.008491806622565629\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466143,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466143\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9553001277139208,\n \"acc_stderr\": 0.007389578763460815,\n \"acc_norm\": 
0.9553001277139208,\n \"acc_norm_stderr\": 0.007389578763460815\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8959537572254336,\n \"acc_stderr\": 0.016437904423993795,\n \"acc_norm\": 0.8959537572254336,\n \"acc_norm_stderr\": 0.016437904423993795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.9016759776536313,\n \"acc_stderr\": 0.009958325296075616,\n \"acc_norm\": 0.9016759776536313,\n \"acc_norm_stderr\": 0.009958325296075616\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.013472901983855722,\n \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.013472901983855722\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.9260450160771704,\n \"acc_stderr\": 0.014863426023220673,\n \"acc_norm\": 0.9260450160771704,\n \"acc_norm_stderr\": 0.014863426023220673\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.9228395061728395,\n \"acc_stderr\": 0.014847704893944923,\n \"acc_norm\": 0.9228395061728395,\n \"acc_norm_stderr\": 0.014847704893944923\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.8085106382978723,\n \"acc_stderr\": 0.023472645247949443,\n \"acc_norm\": 0.8085106382978723,\n \"acc_norm_stderr\": 0.023472645247949443\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8761408083441982,\n \"acc_stderr\": 0.008413563905877691,\n \"acc_norm\": 0.8761408083441982,\n \"acc_norm_stderr\": 0.008413563905877691\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9522058823529411,\n \"acc_stderr\": 0.012958896125913097,\n \"acc_norm\": 0.9522058823529411,\n \"acc_norm_stderr\": 0.012958896125913097\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.9101307189542484,\n \"acc_stderr\": 0.011570094738536468,\n \"acc_norm\": 0.9101307189542484,\n \"acc_norm_stderr\": 0.011570094738536468\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.03192226512468566,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.03192226512468566\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.926530612244898,\n \"acc_stderr\": 0.016702757799433237,\n \"acc_norm\": 0.926530612244898,\n \"acc_norm_stderr\": 0.016702757799433237\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9651741293532339,\n \"acc_stderr\": 0.012963994249547642,\n \"acc_norm\": 0.9651741293532339,\n \"acc_norm_stderr\": 0.012963994249547642\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.97,\n \"acc_stderr\": 0.01714466079977652,\n \"acc_norm\": 0.97,\n \"acc_norm_stderr\": 0.01714466079977652\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.7048192771084337,\n \"acc_stderr\": 0.0355092018568963,\n \"acc_norm\": 0.7048192771084337,\n \"acc_norm_stderr\": 0.0355092018568963\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9415204678362573,\n \"acc_stderr\": 0.017996678857280134,\n \"acc_norm\": 0.9415204678362573,\n \"acc_norm_stderr\": 0.017996678857280134\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.017151605555749138,\n \"mc2\": 0.5863202491791143,\n \"mc2_stderr\": 0.015280659551121102\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047451\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5678544351781653,\n \"acc_stderr\": 0.013645072137842447\n }\n}\n```", "repo_url": 
"https://huggingface.co/AA051611/O0201", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-57.040098.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-57.040098.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-57.040098.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-57.040098.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-57.040098.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-57.040098.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["**/details_harness|winogrande|5_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-15-57.040098.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_15_57.040098", "path": ["results_2024-02-02T03-15-57.040098.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T03-15-57.040098.parquet"]}]}]}
2024-02-02T03:18:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051611/O0201 Dataset automatically created during the evaluation run of model AA051611/O0201 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:15:57.040098 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
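The condensed card above refers to a load snippet ("you can for instance do the following") that is not reproduced in this flattened copy. Below is a minimal sketch of what that call could look like; the repository id is inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern and is not quoted from the card, so treat it as an assumption.

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention (the slash in
# "AA051611/O0201" becomes a double underscore); verify it before relying on it.
data = load_dataset(
    "open-llm-leaderboard/details_AA051611__O0201",
    "harness_winogrande_5",  # any of the 63 per-task configurations works here
    split="train",           # per the card, "train" points at the latest results
)
print(data)
```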
[ "# Dataset Card for Evaluation run of AA051611/O0201\n\n\n\nDataset automatically created during the evaluation run of model AA051611/O0201 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:15:57.040098(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051611/O0201\n\n\n\nDataset automatically created during the evaluation run of model AA051611/O0201 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:15:57.040098(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
20565e8ac272eca57bb2ced25e25489043033020
# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge-Variant2](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:18:15.694379](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2/blob/main/results_2024-02-02T03-18-15.694379.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6538297853321007, "acc_stderr": 0.0320522890373237, "acc_norm": 0.6530566018124656, "acc_norm_stderr": 0.03272874681048371, "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.7034639754228852, "mc2_stderr": 0.014889031021791599 }, "harness|arc:challenge|25": { "acc": 0.7081911262798635, "acc_stderr": 0.013284525292403511, "acc_norm": 0.7252559726962458, "acc_norm_stderr": 0.013044617212771227 }, "harness|hellaswag|10": { "acc": 0.7144991037641903, "acc_stderr": 0.0045072961962278075, "acc_norm": 0.8851822346146186, "acc_norm_stderr": 0.003181503506054323 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720386, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720386 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.0373852067611967, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.0373852067611967 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm":
0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531003, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531003 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.02385479568097112, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.02385479568097112 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.02925290592725197, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.02925290592725197 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886786, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4223463687150838, "acc_stderr": 0.01651959427529712, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.01651959427529712 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.02609016250427905, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.02447722285613511, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.02447722285613511 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.012741974333897227, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.012741974333897227 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806315, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806315 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.02740385941078685, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.02740385941078685 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.7034639754228852, "mc2_stderr": 0.014889031021791599 }, "harness|winogrande|5": { "acc": 0.8579321231254933, "acc_stderr": 0.009812000391679369 }, "harness|gsm8k|5": { "acc": 0.6830932524639879, "acc_stderr": 0.012815868296721353 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
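A brief usage note for the card above: besides the per-task configurations, the aggregated scores shown under "Latest results" are also exposed through the "results" configuration, whose "latest" split should point at the most recent run. This is a minimal sketch, assuming the config and split names follow the same pattern as the other records in this dump (the "latest" split resolving to the 2024-02-02T03:18:15.694379 run is an assumption, not quoted from this card's metadata).

```python
from datasets import load_dataset

# Aggregated metrics for the evaluation run; "latest" is assumed to resolve to
# the most recent results_*.parquet file for this model.
results = load_dataset(
    "open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores for this run
```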
open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2
[ "region:us" ]
2024-02-02T03:20:36+00:00
{"pretty_name": "Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge-Variant2](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:18:15.694379](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2/blob/main/results_2024-02-02T03-18-15.694379.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538297853321007,\n \"acc_stderr\": 0.0320522890373237,\n \"acc_norm\": 0.6530566018124656,\n \"acc_norm_stderr\": 0.03272874681048371,\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.7034639754228852,\n \"mc2_stderr\": 0.014889031021791599\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7144991037641903,\n \"acc_stderr\": 0.0045072961962278075,\n \"acc_norm\": 0.8851822346146186,\n \"acc_norm_stderr\": 0.003181503506054323\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.01651959427529712,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.01651959427529712\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.7034639754228852,\n \"mc2_stderr\": 0.014889031021791599\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8579321231254933,\n \"acc_stderr\": 0.009812000391679369\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \"acc_stderr\": 0.012815868296721353\n 
}\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_18_15.694379", "path": ["**/details_harness|winogrande|5_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-18-15.694379.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T03_18_15.694379", "path": ["results_2024-02-02T03-18-15.694379.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-18-15.694379.parquet"]}]}]}
2024-02-02T03:20:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 Dataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:18:15.694379 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
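The card above refers to a loading snippet ("To load the details from a run, you can for instance do the following:") that was dropped when the text was flattened. A minimal sketch is given below; the repository id and the `harness_winogrande_5` config are assumptions inferred from the `details_<org>__<model>` naming convention and the config layout used by the other evaluation-run datasets in this collection.

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> naming convention
# used by the other evaluation-run datasets in this collection.
data = load_dataset(
    "open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2",
    "harness_winogrande_5",  # any config name listed in this record's metadata also works
    split="train",
)
```

Any per-task config listed in this record's metadata (for example `harness_hendrycksTest_abstract_algebra_5`) can be loaded the same way, and the `results` config holds the aggregated metrics for the run.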
[ "# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:18:15.694379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:18:15.694379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1311abe6f0635a013b89ae962f06fb161e86db1f
# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Samee-ur/NeuralPipe-7B-slerp](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:25:19.988005](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp/blob/main/results_2024-02-02T03-25-19.988005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6444688446653744, "acc_stderr": 0.03217564834975917, "acc_norm": 0.6448609553287138, "acc_norm_stderr": 0.032833467276313325, "mc1": 0.4283965728274174, "mc1_stderr": 0.017323088597314754, "mc2": 0.5985018412437423, "mc2_stderr": 0.01514980059720055 }, "harness|arc:challenge|25": { "acc": 0.6476109215017065, "acc_stderr": 0.013960142600598675, "acc_norm": 0.6774744027303754, "acc_norm_stderr": 0.013659980894277364 }, "harness|hellaswag|10": { "acc": 0.6700856403106951, "acc_stderr": 0.004692208279690595, "acc_norm": 0.8616809400517825, "acc_norm_stderr": 0.0034452899250117337 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.02863723563980089, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.02863723563980089 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778405, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778405 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603346, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603346 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.02412112541694119, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.02412112541694119 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297793, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 
0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8550458715596331, "acc_stderr": 0.01509421569970048, "acc_norm": 0.8550458715596331, "acc_norm_stderr": 0.01509421569970048 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.034076320938540516, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.034076320938540516 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.0257449025322909, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.0257449025322909 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.013265346261323793, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.013265346261323793 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36312849162011174, "acc_stderr": 0.016083749986853697, "acc_norm": 0.36312849162011174, "acc_norm_stderr": 0.016083749986853697 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.02495418432487991, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4726205997392438, "acc_stderr": 0.012751075788015058, "acc_norm": 0.4726205997392438, "acc_norm_stderr": 0.012751075788015058 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.01895088677080631, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.01895088677080631 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.027833023871399673, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.027833023871399673 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.4283965728274174, "mc1_stderr": 0.017323088597314754, "mc2": 0.5985018412437423, "mc2_stderr": 0.01514980059720055 }, "harness|winogrande|5": { "acc": 0.8018942383583267, "acc_stderr": 0.01120186274448705 }, "harness|gsm8k|5": { "acc": 0.6853677028051555, "acc_stderr": 0.01279103722733604 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
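The config names and split names listed in this card's metadata can be used to pull a single task's details or the aggregated results for this run. A small sketch, using the repository id of this card and config/split names that appear in the metadata that follows (the "latest" split is the alias the metadata defines for the most recent run):

```python
from datasets import load_dataset

# Per-task details for the GSM8K evaluation of this run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp",
    "harness_gsm8k_5",
    split="latest",
)

# Aggregated metrics for the whole run (the "results" config described in the card).
results = load_dataset(
    "open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp",
    "results",
    split="latest",
)
print(results)
```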
open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp
[ "region:us" ]
2024-02-02T03:27:44+00:00
{"pretty_name": "Evaluation run of Samee-ur/NeuralPipe-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Samee-ur/NeuralPipe-7B-slerp](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:25:19.988005](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp/blob/main/results_2024-02-02T03-25-19.988005.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6444688446653744,\n \"acc_stderr\": 0.03217564834975917,\n \"acc_norm\": 0.6448609553287138,\n \"acc_norm_stderr\": 0.032833467276313325,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277364\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n \"acc_stderr\": 0.004692208279690595,\n \"acc_norm\": 0.8616809400517825,\n \"acc_norm_stderr\": 0.0034452899250117337\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 
0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 
0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \"acc_stderr\": 0.01279103722733604\n }\n}\n```", "repo_url": "https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["**/details_harness|winogrande|5_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-25-19.988005.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_25_19.988005", "path": ["results_2024-02-02T03-25-19.988005.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T03-25-19.988005.parquet"]}]}]}
2024-02-02T03:28:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp Dataset automatically created during the evaluation run of model Samee-ur/NeuralPipe-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:25:19.988005 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Samee-ur/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:25:19.988005(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Samee-ur/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:25:19.988005(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9a4a6f680f195a996455221e987331d95761ce26
# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:25:38.031965](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901/blob/main/results_2024-02-02T03-25-38.031965.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7324210063058885, "acc_stderr": 0.029359152172342876, "acc_norm": 0.7374002642900267, "acc_norm_stderr": 0.029909325332768254, "mc1": 0.39167686658506734, "mc1_stderr": 0.017087795881769625, "mc2": 0.5509212115856775, "mc2_stderr": 0.015260405621718143 }, "harness|arc:challenge|25": { "acc": 0.6296928327645052, "acc_stderr": 0.01411129875167495, "acc_norm": 0.6493174061433447, "acc_norm_stderr": 0.013944635930726099 }, "harness|hellaswag|10": { "acc": 0.6521609241187014, "acc_stderr": 0.0047531124327286995, "acc_norm": 0.8498307110137423, "acc_norm_stderr": 0.0035650718701954478 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7037037037037037, "acc_stderr": 0.03944624162501116, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.03944624162501116 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8157894736842105, "acc_stderr": 0.0315469804508223, "acc_norm": 0.8157894736842105, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7924528301886793, "acc_stderr": 0.02495991802891127, "acc_norm": 0.7924528301886793, "acc_norm_stderr": 0.02495991802891127 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8680555555555556, "acc_stderr": 0.02830096838204443, "acc_norm": 0.8680555555555556, "acc_norm_stderr": 0.02830096838204443 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, 
"acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5196078431372549, "acc_stderr": 0.04971358884367406, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.04971358884367406 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7617021276595745, "acc_stderr": 0.027851252973889778, "acc_norm": 0.7617021276595745, "acc_norm_stderr": 0.027851252973889778 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5526315789473685, "acc_stderr": 0.04677473004491199, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7586206896551724, "acc_stderr": 0.03565998174135302, "acc_norm": 0.7586206896551724, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6402116402116402, "acc_stderr": 0.02471807594412928, "acc_norm": 0.6402116402116402, "acc_norm_stderr": 0.02471807594412928 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8806451612903226, "acc_stderr": 0.01844341132531541, "acc_norm": 0.8806451612903226, "acc_norm_stderr": 0.01844341132531541 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6403940886699507, "acc_stderr": 0.03376458246509567, "acc_norm": 0.6403940886699507, "acc_norm_stderr": 0.03376458246509567 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.8, "acc_stderr": 0.040201512610368445, "acc_norm": 0.8, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8363636363636363, "acc_stderr": 0.02888787239548795, "acc_norm": 0.8363636363636363, "acc_norm_stderr": 0.02888787239548795 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.898989898989899, "acc_stderr": 0.02146973557605535, "acc_norm": 0.898989898989899, "acc_norm_stderr": 0.02146973557605535 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9637305699481865, "acc_stderr": 0.013492659751295145, "acc_norm": 0.9637305699481865, "acc_norm_stderr": 0.013492659751295145 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7692307692307693, "acc_stderr": 0.021362027725222704, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.021362027725222704 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02995824925008211, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02995824925008211 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.819327731092437, "acc_stderr": 0.024991964966600756, "acc_norm": 0.819327731092437, "acc_norm_stderr": 0.024991964966600756 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.44370860927152317, "acc_stderr": 0.04056527902281732, "acc_norm": 0.44370860927152317, "acc_norm_stderr": 0.04056527902281732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9119266055045872, "acc_stderr": 0.012150743719481655, "acc_norm": 0.9119266055045872, "acc_norm_stderr": 0.012150743719481655 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03214952147802749, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03214952147802749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8872549019607843, "acc_stderr": 0.02219857103945681, "acc_norm": 0.8872549019607843, "acc_norm_stderr": 0.02219857103945681 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7219730941704036, "acc_stderr": 0.030069584874494043, "acc_norm": 0.7219730941704036, "acc_norm_stderr": 0.030069584874494043 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8760330578512396, "acc_stderr": 0.030083098716035196, "acc_norm": 0.8760330578512396, "acc_norm_stderr": 0.030083098716035196 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8796296296296297, "acc_stderr": 0.03145703854306252, "acc_norm": 0.8796296296296297, "acc_norm_stderr": 0.03145703854306252 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8773006134969326, "acc_stderr": 0.025777328426978927, "acc_norm": 0.8773006134969326, "acc_norm_stderr": 0.025777328426978927 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8737864077669902, "acc_stderr": 0.03288180278808628, "acc_norm": 0.8737864077669902, "acc_norm_stderr": 0.03288180278808628 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.01789378490401854, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.01789378490401854 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8901660280970626, "acc_stderr": 0.01118151050324705, "acc_norm": 0.8901660280970626, "acc_norm_stderr": 0.01118151050324705 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8005780346820809, "acc_stderr": 0.021511900654252538, "acc_norm": 0.8005780346820809, "acc_norm_stderr": 0.021511900654252538 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7016759776536313, "acc_stderr": 0.01530184004512928, "acc_norm": 0.7016759776536313, "acc_norm_stderr": 0.01530184004512928 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8104575163398693, "acc_stderr": 0.0224423582633362, "acc_norm": 0.8104575163398693, "acc_norm_stderr": 0.0224423582633362 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.77491961414791, "acc_stderr": 0.023720088516179027, "acc_norm": 0.77491961414791, "acc_norm_stderr": 0.023720088516179027 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8395061728395061, "acc_stderr": 0.020423955354778034, "acc_norm": 0.8395061728395061, "acc_norm_stderr": 
0.020423955354778034 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6063829787234043, "acc_stderr": 0.029144544781596154, "acc_norm": 0.6063829787234043, "acc_norm_stderr": 0.029144544781596154 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5619295958279009, "acc_stderr": 0.012671902782567641, "acc_norm": 0.5619295958279009, "acc_norm_stderr": 0.012671902782567641 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7794117647058824, "acc_stderr": 0.025187786660227262, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.025187786660227262 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7728758169934641, "acc_stderr": 0.016949853279212376, "acc_norm": 0.7728758169934641, "acc_norm_stderr": 0.016949853279212376 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8367346938775511, "acc_stderr": 0.02366169917709861, "acc_norm": 0.8367346938775511, "acc_norm_stderr": 0.02366169917709861 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8805970149253731, "acc_stderr": 0.02292879327721974, "acc_norm": 0.8805970149253731, "acc_norm_stderr": 0.02292879327721974 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.03844453181770917, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.024648068961366152, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.024648068961366152 }, "harness|truthfulqa:mc|0": { "mc1": 0.39167686658506734, "mc1_stderr": 0.017087795881769625, "mc2": 0.5509212115856775, "mc2_stderr": 0.015260405621718143 }, "harness|winogrande|5": { "acc": 0.7932123125493291, "acc_stderr": 0.011382566829235812 }, "harness|gsm8k|5": { "acc": 0.5951478392721758, "acc_stderr": 0.013520817666870506 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
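For reference, the aggregated "results" configuration mentioned at the top of this card can be loaded with the same `datasets` API. This is a minimal sketch, assuming the config name "results" and the "latest" split listed in this dataset's configuration metadata:

```python
from datasets import load_dataset

# Load the aggregated metrics for this evaluation run (config "results",
# "latest" split, as listed in the dataset's configuration metadata).
results = load_dataset(
    "open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901",
    "results",
    split="latest",
)

# The split contains the aggregated scores as rows; inspect the first one.
print(results[0])
```

The per-task configurations (for example "harness_winogrande_5", as shown earlier in this card) can be loaded the same way by swapping the config name.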
open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901
[ "region:us" ]
2024-02-02T03:27:51+00:00
{"pretty_name": "Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901", "dataset_summary": "Dataset automatically created during the evaluation run of model [adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901](https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:25:38.031965](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901/blob/main/results_2024-02-02T03-25-38.031965.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7324210063058885,\n \"acc_stderr\": 0.029359152172342876,\n \"acc_norm\": 0.7374002642900267,\n \"acc_norm_stderr\": 0.029909325332768254,\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5509212115856775,\n \"mc2_stderr\": 0.015260405621718143\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726099\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6521609241187014,\n \"acc_stderr\": 0.0047531124327286995,\n \"acc_norm\": 0.8498307110137423,\n \"acc_norm_stderr\": 0.0035650718701954478\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n \"acc_norm_stderr\": 0.02830096838204443\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889778,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889778\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6402116402116402,\n \"acc_stderr\": 0.02471807594412928,\n \"acc_norm\": 0.6402116402116402,\n \"acc_norm_stderr\": 0.02471807594412928\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8806451612903226,\n \"acc_stderr\": 0.01844341132531541,\n \"acc_norm\": 0.8806451612903226,\n \"acc_norm_stderr\": 0.01844341132531541\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.898989898989899,\n \"acc_stderr\": 0.02146973557605535,\n \"acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605535\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295145,\n \"acc_norm\": 0.9637305699481865,\n 
\"acc_norm_stderr\": 0.013492659751295145\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.021362027725222704,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.021362027725222704\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02995824925008211,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02995824925008211\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.024991964966600756,\n \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.024991964966600756\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481655,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481655\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945681,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945681\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n \"acc_stderr\": 0.030069584874494043,\n \"acc_norm\": 0.7219730941704036,\n \"acc_norm_stderr\": 0.030069584874494043\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035196,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035196\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306252,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.03145703854306252\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401854,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401854\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8901660280970626,\n \"acc_stderr\": 0.01118151050324705,\n \"acc_norm\": 0.8901660280970626,\n \"acc_norm_stderr\": 0.01118151050324705\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252538,\n \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252538\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7016759776536313,\n \"acc_stderr\": 0.01530184004512928,\n \"acc_norm\": 0.7016759776536313,\n \"acc_norm_stderr\": 0.01530184004512928\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8104575163398693,\n \"acc_stderr\": 0.0224423582633362,\n \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.0224423582633362\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6063829787234043,\n \"acc_stderr\": 0.029144544781596154,\n \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.029144544781596154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5619295958279009,\n \"acc_stderr\": 0.012671902782567641,\n \"acc_norm\": 0.5619295958279009,\n \"acc_norm_stderr\": 0.012671902782567641\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.025187786660227262,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.025187786660227262\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7728758169934641,\n \"acc_stderr\": 0.016949853279212376,\n \"acc_norm\": 0.7728758169934641,\n \"acc_norm_stderr\": 0.016949853279212376\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5509212115856775,\n \"mc2_stderr\": 0.015260405621718143\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235812\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5951478392721758,\n \"acc_stderr\": 0.013520817666870506\n }\n}\n```", "repo_url": "https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-38.031965.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-38.031965.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-38.031965.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-38.031965.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-38.031965.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["**/details_harness|winogrande|5_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T03-25-38.031965.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_25_38.031965", "path": ["results_2024-02-02T03-25-38.031965.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-25-38.031965.parquet"]}]}]}
2024-02-02T03:28:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901 Dataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading example after this card): ## Latest results These are the latest results from run 2024-02-02T03:25:38.031965 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
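A minimal loading example for the card above, mirroring the snippet embedded in this record's dataset-summary metadata; the repository id and the `harness_winogrande_5` config name are taken from the configs listed in that metadata, and any other config name from that list can be substituted.

```python
from datasets import load_dataset

# Per-sample details for one evaluation task (here: Winogrande, 5-shot).
# The configs metadata also declares a timestamped split and a "latest" split
# for each config; the summary's own snippet uses split="train".
data = load_dataset(
    "open-llm-leaderboard/details_adamo1139__Yi-34B-200K-AEZAKMI-RAW-2901",
    "harness_winogrande_5",
    split="train",
)
print(data)
```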
[ "# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:25:38.031965(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901\n\n\n\nDataset automatically created during the evaluation run of model adamo1139/Yi-34B-200K-AEZAKMI-RAW-2901 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:25:38.031965(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
594d9f8bfe0f15cfd53a3b40f5da8f3e21170b1b
# Dataset Card for "metamathqa_formatted" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dlibf/metamathqa_formatted
[ "region:us" ]
2024-02-02T03:28:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "test_sft", "path": "data/test_sft-*"}]}], "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 294158314.68253165, "num_examples": 394900}, {"name": "test_sft", "num_bytes": 74489.31746835443, "num_examples": 100}], "download_size": 129446994, "dataset_size": 294232804.0}}
2024-02-02T03:28:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "metamathqa_formatted" More Information needed
[ "# Dataset Card for \"metamathqa_formatted\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"metamathqa_formatted\"\n\nMore Information needed" ]
0378418e51bada824d5957d201339d082b1b9386
# Dataset Card for "glaive-code-assistant" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dlibf/glaive-code-assistant
[ "region:us" ]
2024-02-02T03:28:40+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "test_sft", "path": "data/test_sft-*"}]}], "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 210616334.29604948, "num_examples": 136009}, {"name": "test_sft", "num_bytes": 154854.70395051024, "num_examples": 100}], "download_size": 102642844, "dataset_size": 210771189.0}}
2024-02-02T03:29:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for "glaive-code-assistant" More Information needed
[ "# Dataset Card for \"glaive-code-assistant\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"glaive-code-assistant\"\n\nMore Information needed" ]
e4476561765b658800e8e05d667ab5636da356c6
# Dataset Card for Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:42:19.232314](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-02-02T03-42-19.232314.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6530732061402786, "acc_stderr": 0.031986064565857564, "acc_norm": 0.6546095792380836, "acc_norm_stderr": 0.0326302871117009, "mc1": 0.3880048959608323, "mc1_stderr": 0.017058761501347972, "mc2": 0.5684120643866822, "mc2_stderr": 0.015214628002199675 }, "harness|arc:challenge|25": { "acc": 0.6467576791808873, "acc_stderr": 0.013967822714840055, "acc_norm": 0.6877133105802048, "acc_norm_stderr": 0.013542598541688065 }, "harness|hellaswag|10": { "acc": 0.655646285600478, "acc_stderr": 0.004741859753178433, "acc_norm": 0.8500298745269866, "acc_norm_stderr": 0.0035631244274585173 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.049512182523962625, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.049512182523962625 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.04082482904638629, "acc_norm": 0.6, "acc_norm_stderr": 0.04082482904638629 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138208, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138208 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.033175059300091826, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.033175059300091826 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033467, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033467 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083018, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083018 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699803, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699803 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494733, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.03351953879521271, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.03351953879521271 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.020237149008990932, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.020237149008990932 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.013265346261323788, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.013265346261323788 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4201117318435754, "acc_stderr": 0.016507671073256402, "acc_norm": 0.4201117318435754, "acc_norm_stderr": 0.016507671073256402 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.024848018263875195, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.024848018263875195 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, 
"acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.02971928127223685, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.02971928127223685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4791395045632334, "acc_stderr": 0.012759117066518012, "acc_norm": 0.4791395045632334, "acc_norm_stderr": 0.012759117066518012 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7058823529411765, "acc_stderr": 0.027678468642144717, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.027678468642144717 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.01909422816700033, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.01909422816700033 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.02484575321230604, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.02484575321230604 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896309, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896309 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.3880048959608323, "mc1_stderr": 0.017058761501347972, "mc2": 0.5684120643866822, "mc2_stderr": 0.015214628002199675 }, "harness|winogrande|5": { "acc": 0.8011049723756906, "acc_stderr": 0.011218629972515314 }, "harness|gsm8k|5": { "acc": 0.6497346474601972, "acc_stderr": 0.013140409455571276 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp
[ "region:us" ]
2024-02-02T03:44:41+00:00
{"pretty_name": "Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:42:19.232314](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-02-02T03-42-19.232314.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530732061402786,\n \"acc_stderr\": 0.031986064565857564,\n \"acc_norm\": 0.6546095792380836,\n \"acc_norm_stderr\": 0.0326302871117009,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5684120643866822,\n \"mc2_stderr\": 0.015214628002199675\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.655646285600478,\n \"acc_stderr\": 0.004741859753178433,\n \"acc_norm\": 0.8500298745269866,\n \"acc_norm_stderr\": 0.0035631244274585173\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n 
\"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033467,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 
0.021500249576033467\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521271,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521271\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990932,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990932\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n \"acc_stderr\": 0.012759117066518012,\n \"acc_norm\": 0.4791395045632334,\n \"acc_norm_stderr\": 0.012759117066518012\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5684120643866822,\n \"mc2_stderr\": 0.015214628002199675\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515314\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6497346474601972,\n \"acc_stderr\": 0.013140409455571276\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["**/details_harness|winogrande|5_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T03-42-19.232314.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_42_19.232314", "path": ["results_2024-02-02T03-42-19.232314.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-42-19.232314.parquet"]}]}]}
2024-02-02T03:45:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp Dataset automatically created during the evaluation run of model Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:42:19.232314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:42:19.232314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:42:19.232314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7008efea31d02f0e6ba60c227ac8e4ddea91ed3c
# Dataset Card for Evaluation run of nisten/bigdoc-c34b-instruct-tf32 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [nisten/bigdoc-c34b-instruct-tf32](https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:54:56.700611](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32/blob/main/results_2024-02-02T03-54-56.700611.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.555723855403699, "acc_stderr": 0.034056643851026316, "acc_norm": 0.5596322402810356, "acc_norm_stderr": 0.034763618590594646, "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454895, "mc2": 0.4446186897080597, "mc2_stderr": 0.014549361291628982 }, "harness|arc:challenge|25": { "acc": 0.5102389078498294, "acc_stderr": 0.014608326906285012, "acc_norm": 0.5443686006825939, "acc_norm_stderr": 0.01455374993930686 }, "harness|hellaswag|10": { "acc": 0.5637323242381995, "acc_stderr": 0.004949080334816023, "acc_norm": 0.7690699063931488, "acc_norm_stderr": 0.004205665144562955 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464244, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464244 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.49433962264150944, "acc_stderr": 0.030770900763851302, "acc_norm": 0.49433962264150944, "acc_norm_stderr": 0.030770900763851302 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5, "acc_stderr": 0.04181210050035455, "acc_norm": 0.5, "acc_norm_stderr": 0.04181210050035455 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.45664739884393063, "acc_stderr": 0.03798106566014499, "acc_norm": 0.45664739884393063, "acc_norm_stderr": 0.03798106566014499 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.48936170212765956, "acc_stderr": 0.03267862331014063, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.03267862331014063 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.04644602091222318, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.04644602091222318 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.041618085035015295, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.025167982333894143, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.025167982333894143 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.603225806451613, "acc_stderr": 0.027831231605767948, "acc_norm": 0.603225806451613, "acc_norm_stderr": 0.027831231605767948 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.03445487686264716, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.03445487686264716 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6727272727272727, "acc_stderr": 0.03663974994391242, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.03663974994391242 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.031730712390717244, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.031730712390717244 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.03003114797764154, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5025641025641026, "acc_stderr": 0.025350672979412188, "acc_norm": 0.5025641025641026, "acc_norm_stderr": 0.025350672979412188 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.0284934650910286, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.0284934650910286 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5168067226890757, "acc_stderr": 0.03246013680375308, "acc_norm": 0.5168067226890757, "acc_norm_stderr": 0.03246013680375308 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7155963302752294, "acc_stderr": 0.01934203658770258, "acc_norm": 0.7155963302752294, "acc_norm_stderr": 0.01934203658770258 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.03362277436608044, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608044 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695063, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695063 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.02782078198114969, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.02782078198114969 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6053811659192825, "acc_stderr": 0.03280400504755291, "acc_norm": 0.6053811659192825, "acc_norm_stderr": 0.03280400504755291 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.040261875275912046, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.040261875275912046 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6759259259259259, "acc_stderr": 0.04524596007030048, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.04524596007030048 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724146, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724146 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.046561471100123514, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.046561471100123514 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652265, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652265 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7088122605363985, "acc_stderr": 0.0162460870697014, "acc_norm": 0.7088122605363985, "acc_norm_stderr": 0.0162460870697014 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5751445086705202, "acc_stderr": 0.026613350840261736, "acc_norm": 0.5751445086705202, "acc_norm_stderr": 0.026613350840261736 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3206703910614525, "acc_stderr": 0.0156099295593484, "acc_norm": 0.3206703910614525, "acc_norm_stderr": 0.0156099295593484 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5915032679738562, "acc_stderr": 0.028146405993096358, "acc_norm": 0.5915032679738562, "acc_norm_stderr": 0.028146405993096358 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6045016077170418, "acc_stderr": 0.027770918531427838, "acc_norm": 0.6045016077170418, "acc_norm_stderr": 0.027770918531427838 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6049382716049383, "acc_stderr": 0.027201117666925657, "acc_norm": 0.6049382716049383, "acc_norm_stderr": 0.027201117666925657 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.41134751773049644, "acc_stderr": 0.02935491115994098, "acc_norm": 0.41134751773049644, "acc_norm_stderr": 0.02935491115994098 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39308996088657105, "acc_stderr": 0.012474899613873956, "acc_norm": 0.39308996088657105, "acc_norm_stderr": 0.012474899613873956 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.46691176470588236, "acc_stderr": 0.03030625772246832, "acc_norm": 0.46691176470588236, "acc_norm_stderr": 0.03030625772246832 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5, "acc_stderr": 0.020227834851568375, "acc_norm": 0.5, "acc_norm_stderr": 0.020227834851568375 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726496, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726496 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7711442786069652, "acc_stderr": 0.02970528405677243, "acc_norm": 0.7711442786069652, "acc_norm_stderr": 0.02970528405677243 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.78, "acc_stderr": 0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322626 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7485380116959064, "acc_stderr": 0.033275044238468436, "acc_norm": 0.7485380116959064, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454895, "mc2": 0.4446186897080597, "mc2_stderr": 0.014549361291628982 }, "harness|winogrande|5": { "acc": 0.744277821625888, "acc_stderr": 0.012261253845440474 }, "harness|gsm8k|5": { "acc": 0.37604245640636846, "acc_stderr": 0.01334253206484978 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
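As a complement to the per-task loading example above, here is a minimal sketch of how the aggregated "results" configuration mentioned in this card could be loaded. It assumes the same `datasets` API as the example above and the config/split names listed in this card's metadata (a "latest" split is defined alongside the timestamped split):

```python
from datasets import load_dataset

# Load the aggregated metrics for this evaluation run.
# "results" is the extra configuration holding the aggregated scores;
# the "latest" split points to the most recent run (a timestamped split
# such as "2024_02_02T03_54_56.700611" is also available).
results = load_dataset(
    "open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32",
    "results",
    split="latest",
)
print(results)  # inspect the aggregated scores
```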
open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32
[ "region:us" ]
2024-02-02T03:57:19+00:00
{"pretty_name": "Evaluation run of nisten/bigdoc-c34b-instruct-tf32", "dataset_summary": "Dataset automatically created during the evaluation run of model [nisten/bigdoc-c34b-instruct-tf32](https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:54:56.700611](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32/blob/main/results_2024-02-02T03-54-56.700611.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.555723855403699,\n \"acc_stderr\": 0.034056643851026316,\n \"acc_norm\": 0.5596322402810356,\n \"acc_norm_stderr\": 0.034763618590594646,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.4446186897080597,\n \"mc2_stderr\": 0.014549361291628982\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.01455374993930686\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5637323242381995,\n \"acc_stderr\": 0.004949080334816023,\n \"acc_norm\": 0.7690699063931488,\n \"acc_norm_stderr\": 0.004205665144562955\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464244,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464244\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.03798106566014499,\n \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.03798106566014499\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.603225806451613,\n \"acc_stderr\": 0.027831231605767948,\n \"acc_norm\": 0.603225806451613,\n \"acc_norm_stderr\": 0.027831231605767948\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391242,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391242\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.031730712390717244,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.031730712390717244\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412188,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412188\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770258,\n \"acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770258\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7088122605363985,\n \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.7088122605363985,\n \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.026613350840261736,\n \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.026613350840261736\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.0156099295593484,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.0156099295593484\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39308996088657105,\n \"acc_stderr\": 0.012474899613873956,\n \"acc_norm\": 0.39308996088657105,\n \"acc_norm_stderr\": 0.012474899613873956\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.03030625772246832,\n \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.03030625772246832\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.02970528405677243,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.02970528405677243\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.4446186897080597,\n \"mc2_stderr\": 0.014549361291628982\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37604245640636846,\n \"acc_stderr\": 0.01334253206484978\n }\n}\n```", "repo_url": 
"https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_54_56.700611", "path": ["**/details_harness|winogrande|5_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-54-56.700611.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T03_54_56.700611", "path": ["results_2024-02-02T03-54-56.700611.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-54-56.700611.parquet"]}]}]}
2024-02-02T03:57:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of nisten/bigdoc-c34b-instruct-tf32 Dataset automatically created during the evaluation run of model nisten/bigdoc-c34b-instruct-tf32 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:54:56.700611 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
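A minimal sketch of the loading step referenced above, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation datasets in this document (the `harness_winogrande_5` configuration appears in this card's metadata; the exact repository name is an assumption):

```python
from datasets import load_dataset

# Repository name inferred from the usual naming pattern; treat it as an assumption.
data = load_dataset(
    "open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32",
    "harness_winogrande_5",
    split="train",
)
print(data)
```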
[ "# Dataset Card for Evaluation run of nisten/bigdoc-c34b-instruct-tf32\n\n\n\nDataset automatically created during the evaluation run of model nisten/bigdoc-c34b-instruct-tf32 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:54:56.700611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of nisten/bigdoc-c34b-instruct-tf32\n\n\n\nDataset automatically created during the evaluation run of model nisten/bigdoc-c34b-instruct-tf32 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:54:56.700611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3f3054c568d9bb08cff56738af4ca4bf48cadb53
# Dataset Card for Evaluation run of ConvexAI/Pelican-9b-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ConvexAI/Pelican-9b-v0.1](https://huggingface.co/ConvexAI/Pelican-9b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T15:07:35.883760](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1/blob/main/results_2024-02-02T15-07-35.883760.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6135784069632323, "acc_stderr": 0.032209768316442185, "acc_norm": 0.6265622474266279, "acc_norm_stderr": 0.033093604406938995, "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148125, "mc2": 0.5061156023040165, "mc2_stderr": 0.01650422871794908 }, "harness|arc:challenge|25": { "acc": 0.4189419795221843, "acc_stderr": 0.014418106953639015, "acc_norm": 0.47952218430034127, "acc_norm_stderr": 0.014599131353035004 }, "harness|hellaswag|10": { "acc": 0.4372634933280223, "acc_stderr": 0.004950347333701834, "acc_norm": 0.6622186815375424, "acc_norm_stderr": 0.004719870074967236 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595853, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595853 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337128, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337128 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.03246956919789958, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.03246956919789958 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03053289223393202, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03053289223393202 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.02432173848460235, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.02432173848460235 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968352, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968352 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6386554621848739, "acc_stderr": 0.031204691225150016, "acc_norm": 0.6386554621848739, "acc_norm_stderr": 0.031204691225150016 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 
0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507338, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507338 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.41203703703703703, "acc_stderr": 0.03356787758160835, "acc_norm": 0.41203703703703703, "acc_norm_stderr": 0.03356787758160835 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822914, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822914 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128137, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368976, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368976 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7023121387283237, "acc_stderr": 0.024617055388677003, "acc_norm": 0.7023121387283237, "acc_norm_stderr": 0.024617055388677003 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3474860335195531, "acc_stderr": 0.01592556406020815, "acc_norm": 0.3474860335195531, "acc_norm_stderr": 0.01592556406020815 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.026173908506718576, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.026173908506718576 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.02600330111788514, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.02600330111788514 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236837, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236837 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.01273239828619044, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.01273239828619044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6433823529411765, "acc_stderr": 0.02909720956841195, "acc_norm": 0.6433823529411765, "acc_norm_stderr": 0.02909720956841195 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687492, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687492 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982066, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982066 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306053, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306053 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148125, "mc2": 0.5061156023040165, "mc2_stderr": 0.01650422871794908 }, "harness|winogrande|5": { "acc": 0.7466456195737964, "acc_stderr": 0.012223754434233633 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
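As a sketch of how the timestamped splits and the aggregated "results" configuration described in this card can be queried (the configuration and split names below are taken from this card's metadata; names may differ for other runs):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1"

# Aggregated metrics, with "latest" always pointing at the most recent run.
results = load_dataset(repo, "results", split="latest")

# Per-sample details of one specific run, addressed by its timestamped split.
arc_details = load_dataset(
    repo,
    "harness_arc_challenge_25",
    split="2024_02_02T15_07_35.883760",
)
print(results[0])
```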
open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1
[ "region:us" ]
2024-02-02T03:58:24+00:00
{"pretty_name": "Evaluation run of ConvexAI/Pelican-9b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Pelican-9b-v0.1](https://huggingface.co/ConvexAI/Pelican-9b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T15:07:35.883760](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Pelican-9b-v0.1/blob/main/results_2024-02-02T15-07-35.883760.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6135784069632323,\n \"acc_stderr\": 0.032209768316442185,\n \"acc_norm\": 0.6265622474266279,\n \"acc_norm_stderr\": 0.033093604406938995,\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.5061156023040165,\n \"mc2_stderr\": 0.01650422871794908\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4189419795221843,\n \"acc_stderr\": 0.014418106953639015,\n \"acc_norm\": 0.47952218430034127,\n \"acc_norm_stderr\": 0.014599131353035004\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4372634933280223,\n \"acc_stderr\": 0.004950347333701834,\n \"acc_norm\": 0.6622186815375424,\n \"acc_norm_stderr\": 0.004719870074967236\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337128,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337128\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368976,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368976\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.5061156023040165,\n \"mc2_stderr\": 0.01650422871794908\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233633\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/ConvexAI/Pelican-9b-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|arc:challenge|25_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|arc:challenge|25_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|gsm8k|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|gsm8k|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hellaswag|10_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hellaswag|10_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-56-08.046783.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-56-08.046783.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-56-08.046783.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T07-28-13.538776.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T07-28-13.538776.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-07-35.883760.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T15-07-35.883760.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-07-35.883760.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T15-07-35.883760.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-07-35.883760.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": 
"2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["**/details_harness|winogrande|5_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": ["**/details_harness|winogrande|5_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["**/details_harness|winogrande|5_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T15-07-35.883760.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T03_56_08.046783", "path": ["results_2024-02-02T03-56-08.046783.parquet"]}, {"split": "2024_02_02T07_28_13.538776", "path": 
["results_2024-02-02T07-28-13.538776.parquet"]}, {"split": "2024_02_02T15_07_35.883760", "path": ["results_2024-02-02T15-07-35.883760.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T15-07-35.883760.parquet"]}]}]}
2024-02-02T15:10:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ConvexAI/Pelican-9b-v0.1 Dataset automatically created during the evaluation run of model ConvexAI/Pelican-9b-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T15:07:35.883760 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of ConvexAI/Pelican-9b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Pelican-9b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T15:07:35.883760(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ConvexAI/Pelican-9b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Pelican-9b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T15:07:35.883760(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4b408af5f080d68b394a623a71423cb9deabbff2
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch](https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:59:38.771636](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch/blob/main/results_2024-02-02T03-59-38.771636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5581636704911594, "acc_stderr": 0.03393607257808976, "acc_norm": 0.5631863759377935, "acc_norm_stderr": 0.034649068536847794, "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5794497843139403, "mc2_stderr": 0.015364305691423665 }, "harness|arc:challenge|25": { "acc": 0.5366894197952219, "acc_stderr": 0.01457200052775699, "acc_norm": 0.5597269624573379, "acc_norm_stderr": 0.014506769524804234 }, "harness|hellaswag|10": { "acc": 0.5780720971917944, "acc_stderr": 0.004928578106026369, "acc_norm": 0.7677753435570603, "acc_norm_stderr": 0.004213885798268822 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45185185185185184, "acc_stderr": 0.04299268905480864, "acc_norm": 0.45185185185185184, "acc_norm_stderr": 0.04299268905480864 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.04026097083296563, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.04026097083296563 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.625, "acc_stderr": 0.04048439222695598, "acc_norm": 0.625, "acc_norm_stderr": 0.04048439222695598 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 
0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006718, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006718 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5106382978723404, "acc_stderr": 0.03267862331014063, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.03267862331014063 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.045796394220704334, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.045796394220704334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36243386243386244, "acc_stderr": 0.024757473902752045, "acc_norm": 0.36243386243386244, "acc_norm_stderr": 0.024757473902752045 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.043435254289490965, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.043435254289490965 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6451612903225806, "acc_stderr": 0.027218889773308753, "acc_norm": 0.6451612903225806, "acc_norm_stderr": 0.027218889773308753 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.03445487686264715, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.03445487686264715 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.036462049632538115, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.036462049632538115 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7171717171717171, "acc_stderr": 0.032087795587867514, "acc_norm": 0.7171717171717171, "acc_norm_stderr": 0.032087795587867514 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7305699481865285, "acc_stderr": 0.03201867122877794, "acc_norm": 0.7305699481865285, "acc_norm_stderr": 0.03201867122877794 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5307692307692308, "acc_stderr": 0.025302958890850154, "acc_norm": 0.5307692307692308, "acc_norm_stderr": 0.025302958890850154 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228416, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228416 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5504201680672269, "acc_stderr": 0.03231293497137707, "acc_norm": 0.5504201680672269, "acc_norm_stderr": 0.03231293497137707 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7247706422018348, "acc_stderr": 0.019149093743155203, "acc_norm": 0.7247706422018348, "acc_norm_stderr": 0.019149093743155203 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.034028015813589656, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.034028015813589656 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.03228210387037892, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.03228210387037892 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6962025316455697, "acc_stderr": 0.029936696387138598, "acc_norm": 0.6962025316455697, "acc_norm_stderr": 0.029936696387138598 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.0318114974705536, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6717557251908397, "acc_stderr": 0.04118438565806298, "acc_norm": 0.6717557251908397, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.042664163633521685, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.042664163633521685 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854933, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.03680350371286461, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.03680350371286461 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7445721583652618, "acc_stderr": 0.015594955384455772, "acc_norm": 0.7445721583652618, "acc_norm_stderr": 0.015594955384455772 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6040462427745664, "acc_stderr": 0.026329813341946243, "acc_norm": 0.6040462427745664, "acc_norm_stderr": 0.026329813341946243 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331152, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331152 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6143790849673203, "acc_stderr": 0.027870745278290268, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.027870745278290268 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5925925925925926, "acc_stderr": 0.027339546640662727, 
"acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.027339546640662727 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.38652482269503546, "acc_stderr": 0.029049190342543472, "acc_norm": 0.38652482269503546, "acc_norm_stderr": 0.029049190342543472 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.40091264667535853, "acc_stderr": 0.01251696035064082, "acc_norm": 0.40091264667535853, "acc_norm_stderr": 0.01251696035064082 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5919117647058824, "acc_stderr": 0.029855261393483927, "acc_norm": 0.5919117647058824, "acc_norm_stderr": 0.029855261393483927 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.545751633986928, "acc_stderr": 0.0201429745537952, "acc_norm": 0.545751633986928, "acc_norm_stderr": 0.0201429745537952 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.04582004841505417, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.04582004841505417 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6653061224489796, "acc_stderr": 0.030209235226242307, "acc_norm": 0.6653061224489796, "acc_norm_stderr": 0.030209235226242307 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.03076944496729602, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.03076944496729602 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7309941520467836, "acc_stderr": 0.03401052620104089, "acc_norm": 0.7309941520467836, "acc_norm_stderr": 0.03401052620104089 }, "harness|truthfulqa:mc|0": { "mc1": 0.408812729498164, "mc1_stderr": 0.01720995215164173, "mc2": 0.5794497843139403, "mc2_stderr": 0.015364305691423665 }, "harness|winogrande|5": { "acc": 0.734017363851618, "acc_stderr": 0.01241832315305105 }, "harness|gsm8k|5": { "acc": 0.2987111448066717, "acc_stderr": 0.012607137125693639 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch
[ "region:us" ]
2024-02-02T04:01:57+00:00
{"pretty_name": "Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch", "dataset_summary": "Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch](https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T03:59:38.771636](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch/blob/main/results_2024-02-02T03-59-38.771636.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5581636704911594,\n \"acc_stderr\": 0.03393607257808976,\n \"acc_norm\": 0.5631863759377935,\n \"acc_norm_stderr\": 0.034649068536847794,\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5794497843139403,\n \"mc2_stderr\": 0.015364305691423665\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.01457200052775699,\n \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.014506769524804234\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5780720971917944,\n \"acc_stderr\": 0.004928578106026369,\n \"acc_norm\": 0.7677753435570603,\n \"acc_norm_stderr\": 0.004213885798268822\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296563,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296563\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752045,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752045\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.027218889773308753,\n \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.027218889773308753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n \"acc_norm\": 0.7305699481865285,\n 
\"acc_norm_stderr\": 0.03201867122877794\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155203,\n \"acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155203\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.034028015813589656,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.034028015813589656\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138598,\n \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138598\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.7445721583652618,\n \"acc_stderr\": 0.015594955384455772,\n \"acc_norm\": 0.7445721583652618,\n \"acc_norm_stderr\": 0.015594955384455772\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331152,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331152\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290268,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290268\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662727,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662727\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543472,\n \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543472\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40091264667535853,\n \"acc_stderr\": 0.01251696035064082,\n \"acc_norm\": 0.40091264667535853,\n \"acc_norm_stderr\": 0.01251696035064082\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483927,\n \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.0201429745537952,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.0201429745537952\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5794497843139403,\n \"mc2_stderr\": 0.015364305691423665\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.01241832315305105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2987111448066717,\n \"acc_stderr\": 
0.012607137125693639\n }\n}\n```", "repo_url": "https://huggingface.co/kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-59-38.771636.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-59-38.771636.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-59-38.771636.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T03-59-38.771636.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-59-38.771636.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T03_59_38.771636", "path": ["**/details_harness|winogrande|5_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T03-59-38.771636.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T03_59_38.771636", "path": ["results_2024-02-02T03-59-38.771636.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T03-59-38.771636.parquet"]}]}]}
2024-02-02T04:02:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch Dataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T03:59:38.771636 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
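The loading snippet referred to above, as given in this run's dataset summary, is reproduced below; `harness_winogrande_5` is just one of the 63 available configs and can be swapped for any other.

```python
from datasets import load_dataset

# Load the per-sample details for one eval config of this run (the "latest" split also works).
data = load_dataset(
    "open-llm-leaderboard/details_kwchoi__DPO_mistral_v01_7b_ultra_0131_1k_1epoch",
    "harness_winogrande_5",
    split="train",
)
```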
[ "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:59:38.771636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_v01_7b_ultra_0131_1k_1epoch on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T03:59:38.771636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9c3812259fc64591126a3803da93c0274efd617f
# Dataset Card for "Quilt_VQA" **Paper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos** **Paper or resources for more information:** https://quilt-llava.github.io/ <p align="center"> <img src="https://quilt-llava.github.io/static/images/quilt_vqa_samples.png" alt="fig2" width="90%"/> </p> **Description and Details** To evaluate Quilt-LLaVA, alongside public VQA pathology datasets, we also generated Quilt-VQA by extracting Q&A dataset from naturally occurring questions/answers given in the videos. With the help of GPT4 and some handcrafted algorithms, we collect a rich evaluation dataset of 1283 Q&A pairs. Top two rows show image-dependent Q&A pairs and bottom two rows show general-knowledge Q&A pairs. The original question posed by the narrator of the video is highlighted in yellow. **Dataset date:** QUILT-VQA was collected in November 2023. **License:** MIT License; **Where to send questions or comments about the model:** https://github.com/quilt-llava/quilt-llava.github.io/issues **Primary intended uses:** The primary use of QUILT-VQA is for benchmarking histopathology large multimodal models and chatbots. **Primary intended users:** The dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models **Citation** ```bibtex @misc{seyfioglu2023quiltllava, title={Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos}, author={Mehmet Saygin Seyfioglu and Wisdom O. Ikezogwo and Fatemeh Ghezloo and Ranjay Krishna and Linda Shapiro}, year={2023}, eprint={2312.04746}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ```bibtex @misc{ikezogwo2023quilt1m, title={Quilt-1M: One Million Image-Text Pairs for Histopathology}, author={Wisdom Oluchi Ikezogwo and Mehmet Saygin Seyfioglu and Fatemeh Ghezloo and Dylan Stefan Chan Geva and Fatwir Sheikh Mohammed and Pavan Kumar Anand and Ranjay Krishna and Linda Shapiro}, year={2023}, eprint={2306.11207}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
wisdomik/Quilt_VQA
[ "arxiv:2312.04746", "arxiv:2306.11207", "region:us" ]
2024-02-02T04:02:09+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "answer_type", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 225575327.0, "num_examples": 985}], "download_size": 222944861, "dataset_size": 225575327.0}, "extra_gated_prompt": "Please read and agree to the following terms: 1. The requester details provided are not faked. 2. The resource will not be used for commercial/clinical purposes and will be used for scientific research only. 3. The data will not be re-distributed, published, copied, or further disseminated in any way or form whatsoever, whether for profit or not. 4. The right study/paper (Quilt-1M(https://quilt1m.github.io/) and Quilt-LLaVa (https://quilt-llava.github.io) papers) will be cited in any publication(s) that uses this model/data ", "extra_gated_fields": {"Email": "text", "First and last name": "text", "Affiliation": "text", "Type of Affiliation": {"type": "select", "options": ["Academia", "Industry", "Other"]}, "I want to use this model for": {"type": "select", "options": ["Research", "Education", {"label": "Other", "value": "other"}]}, "I agree to the aforementioned terms of use": "checkbox"}}
2024-02-14T21:53:30+00:00
[ "2312.04746", "2306.11207" ]
[]
TAGS #arxiv-2312.04746 #arxiv-2306.11207 #region-us
# Dataset Card for "Quilt_VQA" Paper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos Paper or resources for more information: URL <p align="center"> <img src="URL alt="fig2" width="90%"/> </p> Description and Details To evaluate Quilt-LLaVA, alongside public VQA pathology datasets, we also generated Quilt-VQA by extracting Q&A dataset from naturally occurring questions/answers given in the videos. With the help of GPT4 and some handcrafted algorithms, we collect a rich evaluation dataset of 1283 Q&A pairs. Top two rows show image-dependent Q&A pairs and bottom two rows show general-knowledge Q&A pairs. The original question posed by the narrator of the video is highlighted in yellow. Dataset date: QUILT-VQA was collected in November 2023. License: MIT License; Where to send questions or comments about the model: URL Primary intended uses: The primary use of QUILT-VQA is for benchmarking histopathology large multimodal models and chatbots. Primary intended users: The dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models Citation
[ "# Dataset Card for \"Quilt_VQA\"\n\n\nPaper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos\n\nPaper or resources for more information:\nURL\n\n\n<p align=\"center\">\n <img src=\"URL alt=\"fig2\" width=\"90%\"/>\n</p>\n\n\n\nDescription and Details\nTo evaluate Quilt-LLaVA, alongside public VQA pathology datasets, we also generated Quilt-VQA by extracting Q&A dataset from naturally occurring questions/answers given in the videos. With the help of GPT4 and some handcrafted algorithms, we collect a rich evaluation dataset of 1283 Q&A pairs. Top two rows show image-dependent Q&A pairs and bottom two rows show general-knowledge Q&A pairs. The original question posed by the narrator of the video is highlighted in yellow.\n\n\nDataset date:\nQUILT-VQA was collected in November 2023.\n\nLicense:\nMIT License;\n\nWhere to send questions or comments about the model:\nURL\n\nPrimary intended uses:\nThe primary use of QUILT-VQA is for benchmarking histopathology large multimodal models and chatbots.\n\nPrimary intended users:\nThe dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models\n\nCitation" ]
[ "TAGS\n#arxiv-2312.04746 #arxiv-2306.11207 #region-us \n", "# Dataset Card for \"Quilt_VQA\"\n\n\nPaper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos\n\nPaper or resources for more information:\nURL\n\n\n<p align=\"center\">\n <img src=\"URL alt=\"fig2\" width=\"90%\"/>\n</p>\n\n\n\nDescription and Details\nTo evaluate Quilt-LLaVA, alongside public VQA pathology datasets, we also generated Quilt-VQA by extracting Q&A dataset from naturally occurring questions/answers given in the videos. With the help of GPT4 and some handcrafted algorithms, we collect a rich evaluation dataset of 1283 Q&A pairs. Top two rows show image-dependent Q&A pairs and bottom two rows show general-knowledge Q&A pairs. The original question posed by the narrator of the video is highlighted in yellow.\n\n\nDataset date:\nQUILT-VQA was collected in November 2023.\n\nLicense:\nMIT License;\n\nWhere to send questions or comments about the model:\nURL\n\nPrimary intended uses:\nThe primary use of QUILT-VQA is for benchmarking histopathology large multimodal models and chatbots.\n\nPrimary intended users:\nThe dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models\n\nCitation" ]
3ea979cff54b19101e31a6ab123e6406752e9ea1
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-dpo](https://huggingface.co/CultriX/Wernicke-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T04:20:39.434193](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo/blob/main/results_2024-02-02T04-20-39.434193.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6575013086727888, "acc_stderr": 0.03194701943230958, "acc_norm": 0.6572954814106298, "acc_norm_stderr": 0.03261251839195054, "mc1": 0.5862913096695227, "mc1_stderr": 0.0172408618120998, "mc2": 0.7390753698182038, "mc2_stderr": 0.014550317819192169 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068744, "acc_norm": 0.7184300341296929, "acc_norm_stderr": 0.013143376735009022 }, "harness|hellaswag|10": { "acc": 0.7081258713403704, "acc_stderr": 0.004536955796510544, "acc_norm": 0.8862776339374626, "acc_norm_stderr": 0.0031682493518893065 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700914, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700914 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.033961162058453336, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 
0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.02508596114457966, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.02508596114457966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.822477650063857, "acc_stderr": 0.013664230995834841, "acc_norm": 0.822477650063857, "acc_norm_stderr": 0.013664230995834841 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40558659217877097, "acc_stderr": 0.016421670506339178, "acc_norm": 0.40558659217877097, "acc_norm_stderr": 0.016421670506339178 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7623456790123457, "acc_stderr": 0.02368359183700856, "acc_norm": 0.7623456790123457, "acc_norm_stderr": 0.02368359183700856 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46284224250325945, "acc_stderr": 0.012734923579532069, "acc_norm": 0.46284224250325945, "acc_norm_stderr": 0.012734923579532069 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6911764705882353, "acc_stderr": 0.018690850273595284, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.018690850273595284 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644286, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644286 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128445, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128445 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5862913096695227, "mc1_stderr": 0.0172408618120998, "mc2": 0.7390753698182038, "mc2_stderr": 0.014550317819192169 }, "harness|winogrande|5": { "acc": 0.846093133385951, "acc_stderr": 0.010141944523750033 }, "harness|gsm8k|5": { "acc": 0.6762699014404853, "acc_stderr": 0.012888247397371143 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
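The "Latest results" block above lists one accuracy entry per evaluated task; as a purely illustrative sketch (this is not the official Open LLM Leaderboard aggregation code), the self-contained snippet below shows how such per-subject `acc_norm` values could be averaged over the `harness|hendrycksTest-*|5` entries, reusing a few of the numbers quoted above.

```python
# Self-contained sketch: average MMLU-style (hendrycksTest) normalized accuracies
# from a results dict shaped like the "Latest results" JSON above. Only a few
# subjects are reproduced here for brevity; the full run reports 57 of them.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6666666666666666},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6973684210526315},
    "harness|truthfulqa:mc|0": {"mc2": 0.7390753698182038},  # skipped by the filter below
}

mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"hendrycksTest average over {len(mmlu)} subjects: {sum(mmlu) / len(mmlu):.4f}")
```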
open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo
[ "region:us" ]
2024-02-02T04:22:57+00:00
{"pretty_name": "Evaluation run of CultriX/Wernicke-7B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-dpo](https://huggingface.co/CultriX/Wernicke-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T04:20:39.434193](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo/blob/main/results_2024-02-02T04-20-39.434193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6575013086727888,\n \"acc_stderr\": 0.03194701943230958,\n \"acc_norm\": 0.6572954814106298,\n \"acc_norm_stderr\": 0.03261251839195054,\n \"mc1\": 0.5862913096695227,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7390753698182038,\n \"mc2_stderr\": 0.014550317819192169\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7081258713403704,\n \"acc_stderr\": 0.004536955796510544,\n \"acc_norm\": 0.8862776339374626,\n \"acc_norm_stderr\": 0.0031682493518893065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 
0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339178,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339178\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595284,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595284\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5862913096695227,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7390753698182038,\n \"mc2_stderr\": 0.014550317819192169\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750033\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6762699014404853,\n \"acc_stderr\": 0.012888247397371143\n }\n}\n```", "repo_url": "https://huggingface.co/CultriX/Wernicke-7B-dpo", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|arc:challenge|25_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|gsm8k|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hellaswag|10_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-20-39.434193.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-20-39.434193.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-20-39.434193.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T04-20-39.434193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-20-39.434193.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-20-39.434193.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["**/details_harness|winogrande|5_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T04-20-39.434193.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T04_20_39.434193", "path": ["results_2024-02-02T04-20-39.434193.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T04-20-39.434193.parquet"]}]}]}
2024-02-02T04:23:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-dpo Dataset automatically created during the evaluation run of model CultriX/Wernicke-7B-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-02T04:20:39.434193 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
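The card text above announces a loading snippet after "To load the details from a run, you can for instance do the following:", but the code block was stripped when the card was flattened into this field. A minimal sketch follows; it assumes the repository id follows the leaderboard's usual "details_<org>__<model>" naming pattern (open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo) and uses harness_winogrande_5, one of the config_name values listed in the metadata field of this record.

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_CultriX__Wernicke-7B-dpo",
    "harness_winogrande_5",  # any config_name from the metadata above should work here
    split="train",           # "train" points to the latest results, per the card text
)
```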
[ "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T04:20:39.434193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-02T04:20:39.434193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c58073cac685dd8d0f44a13b2f0be4aa46875bac
A balanced version of scikit_adult_census_income.
jameskrw/balanced_scikit_adult_census_income
[ "task_categories:text-classification", "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "finance", "region:us" ]
2024-02-02T04:32:44+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "tags": ["finance"]}
2024-02-02T05:02:47+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #finance #region-us
A balanced version of scikit_adult_census_income.
[]
[ "TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #finance #region-us \n" ]